All Posts



Hi @secure, you have to insert the three conditions, something like this:

<input type="checkbox" token="field1">
  <label>Test</label>
  <choice value="your_field>90">greater than 90</choice>
  <choice value="your_field>=10 AND your_field<=90">between 10 and 90</choice>
  <choice value="your_field<10">less than 10</choice>
  <prefix>(</prefix>
  <suffix>)</suffix>
  <delimiter> OR </delimiter>
</input>

You can adapt this example to your use case. Ciao. Giuseppe
Hello Splunkers,

Have any of you worked with log files from this Cisco equipment?
- AP 9130
- WiFi Controller 9840

I am interested in how to add more information to the log files. Also, perhaps someone can share a use case for creating dashboards for this equipment.

Thanks in advance for your answers.
The following are the configurations that we have made in the props and transforms conf files:

props.conf

[source::logserver]
TRANSFORMS-enrich = malicious_ip

transforms.conf

[malicious_ip]
filename = malicious_ip.csv
match_type = WILDCARD(dst_ip)
INGEST_EVAL = json_data=lookup("malicious_ip.csv", json_object("dst_ip", dst_ip), json_array("creationdate", "confidence", "tags"))
INGEST_EVAL = creationdate=json_extract(json_data, "creationdate")
INGEST_EVAL = tags=json_extract(json_data, "tags")

The .csv file is located at /opt/splunk/etc/system/local/lookups/malicious_ip.csv and it is accessible through the Search and Reporting app. Still, the logs are not getting enriched at ingest time. Kindly provide the correct conf for props and transforms, plus any additional observations / recommendations. Thanks and regards.
Please share the source of your dashboard, or at least a cut-down version showing what you are trying to do.
I have a table with values, and based on the input checklist selection I want to display the matching table rows. I have a checkbox input enabled in the panel so that users can select a checkbox and the table displays values accordingly:
- greater than 100
- between 10 and 100
- less than 10

How can I use a conditional operator in the input? When I try to add a value with > or < in the input, the search in the panel doesn't work.
You can enrich your data at index time using INGEST_EVAL and your CSV file.  See https://docs.splunk.com/Documentation/Splunk/9.4.0/Data/IngestLookups
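One detail worth checking (a general note, not a confirmed diagnosis of this case): INGEST_EVAL runs on the parsing tier, so the CSV must be present on the indexers or heavy forwarders that parse the data, not just reachable from the search head. A minimal sketch, assuming a deployable app with a hypothetical name like enrich_app:

```
# transforms.conf in the app deployed to the indexers (the app name is a
# hypothetical example; the lookup() eval function at ingest time reads the
# CSV from the app's lookups directory, e.g.
# $SPLUNK_HOME/etc/apps/enrich_app/lookups/malicious_ip.csv)
[malicious_ip]
INGEST_EVAL = json_data=lookup("malicious_ip.csv", json_object("dst_ip", dst_ip), json_array("creationdate", "confidence", "tags"))
INGEST_EVAL = creationdate=json_extract(json_data, "creationdate")
```

For this to match anything, the dst_ip field must already exist at parse time (e.g. via an INDEXED_EXTRACTIONS or a prior INGEST_EVAL), since search-time field extractions have not run yet.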
@kiran_panchavat, I have tried the same as per the community page, but it's still the same: the data are not getting parsed.
Hi Kiran, thanks. "Replace values in an internal field" has solved the problem.
This looks like it might be JSON. Have you looked at the json functions?
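For instance, a hedged sketch using spath and the JSON eval functions (this assumes the event is valid JSON and that the key under "state" is the value you want; field names are taken from the snippet in this thread):

```
... | spath path=state output=state_json
    | eval container_state=json_array_to_mv(json_keys(state_json))
```

Here json_keys() returns the key names of the state object (e.g. "running"), and json_array_to_mv() turns that JSON array into a multivalue field.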
Let me quote myself from earlier: "Just one important thing - if you want to enable TLS, get yourself a CA and issue proper certificates. Using self-signed certs everywhere will not help you much security-wise, and you'll run into trouble when trying to validate them properly (which might be your case)". If you created self-signed certs for your components, you will have problems validating them. If you have a CA from which you issued those certs, you've probably not configured the root CA's cert as trusted.
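As an illustration of that last point (file names and paths here are hypothetical examples, not taken from this thread), trusting the issuing CA on a Splunk instance typically looks something like this in server.conf:

```
# server.conf — a minimal sketch, assuming your CA chain is in cacert.pem
[sslConfig]
# CA certificate(s) that signed your component certs; peers are validated
# against this chain
sslRootCAPath = /opt/splunk/etc/auth/mycerts/cacert.pem
# This instance's certificate (PEM with private key) issued by that CA
serverCert = /opt/splunk/etc/auth/mycerts/server.pem
sslVerifyServerCert = true
```

With a shared CA in sslRootCAPath on every component, certificate validation can actually succeed, which is not possible with unrelated self-signed certs.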
Does Splunk already know what host name has to be replaced for each IP address? For example, are there some other events, or even the same events, which hold this relationship, or do you have a lookup holding this information?
And how is this not what you want?
@cshewalkar Are you looking to change the host value? You can change the value using the replace command: https://docs.splunk.com/Documentation/SplunkCloud/latest/SearchReference/Replace

I hope this helps; if any reply helps you, you could add your upvote/karma points to that reply, thanks.
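A minimal sketch of what that could look like for the search in this thread (the IP is masked in the original post, so 10.65.x.x below is just the same placeholder):

```
index=network host=10.65.x.x ("Interface Ethernet1/50 is *" OR "Interface Ethernet1/49 is *" OR "Interface Ethernet1/3 is *")
| replace "10.65.x.x" WITH "xyz" IN host
| table message_text, host
```

Note that replace only changes the displayed search results; the indexed host value stays the same.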
@anandhalagaras1  Have you checked this community page? https://community.splunk.com/t5/All-Apps-and-Add-ons/Splunk-Add-on-for-Microsoft-Cloud-Services-How-to-edit-props/m-p/242367 
Hi Team,

Need some help: while running the query below I get the host IP, i.e. 10.65.x.x, in the Number display visualization, but I need to replace it with the name "xyz".

index=network ((host=10.65.x.x) AND ((Interface Ethernet1/50 is *) OR (Interface Ethernet1/49 is *) OR (Interface Ethernet1/3 is *)))
| table message_text, host

The screenshot is attached. Can you tell me what needs to be done to solve this issue?
@Amira Hey, you can follow these documents for more information:
- Installation and configuration overview for the Splunk Add-on for Citrix NetScaler: https://docs.splunk.com/Documentation/AddOns/released/CitrixNetScaler/Installationoverview
- Install: https://docs.splunk.com/Documentation/AddOns/released/CitrixNetScaler/Install
- Setup: https://docs.splunk.com/Documentation/AddOns/released/CitrixNetScaler/Setup
- About the Splunk Add-on for Citrix NetScaler - Splunk Documentation

I hope this helps; if any reply helps you, you could add your upvote/karma points to that reply, thanks.
@Jean-Sébastien  You can use rex command. The rex command matches the value of the specified field against the unanchored regular expression and extracts the named groups into fields of the corresponding names. https://docs.splunk.com/Documentation/Splunk/9.4.0/SearchReference/Rex 
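For the snippet in this thread, a rex along these lines might work (the field name "output" is just an example; this assumes the key under "state" is always the value you want):

```
... | rex field=_raw "\"state\":\{\"(?<output>[^\"]+)\""
```

This captures the key name that follows "state":{" — e.g. running in the sample event — into a field called output.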
Hello @Jean-Sébastien, you can use a regular expression with the rex command. This will create a new field, called for example output, that contains values such as running, drinking, and walking. Let me know if you need more assistance!
@bowesmana let me clarify the exact issue. We are ingesting logs from a syslog server in real time, meaning that as soon as logs are generated at the device, the Splunk forwarder forwards them to Splunk for indexing. We have threat intel in the form of a .csv file containing multiple headers, viz. date, ip, valid_from, valid_until, etc. We have ingested this csv file as a lookup and it is accessible through Search and Reporting.

Our architecture has one master/search head and two indexers. We have configured the deployment server on the master, and the indexers (clients) are successfully in sync with it. A deployment app has been created and is being deployed to the clients. The deployment app is aimed at enriching the logs with the threat intel in the csv file. However, this enrichment has to happen before the logs are indexed, and any match between an IP in the log event and an IP in the csv should generate an additional field "Add_field", which should also get indexed along with the syslog logs. We have configured props.conf and transforms.conf in the deployment app; however, the expected behavior is not being achieved.

Regarding your specific query about real time: when we say real time, we mean that logs are enriched at indexing time, and the additional contextual information from the threat intel is indexed in additional fields. A query run on the logs therefore does not need any lookup incorporated into the search. A threat intel match made today should stay in the logs even if the csv file is updated tomorrow.

Looking forward to suitable configurations for props.conf and transforms.conf for index-time enrichment (real-time enrichment), not search-time enrichment. Thanks and regards.
Hello,

I have a big, complex log and want to extract a specific value. Small part of the log:

"state":{"running":{"startedAt":"2024-12-19T13:58:14Z"}}}],

I would like to extract "running" in this case; the value can be different in other events. Could you please help me?