All Posts

I have the below message. How can I display only the ResponseID in the output? Thanks. message: <?xml version='1.0' encoding='ISO-8859-1'?><Submission Id="12345"><LastName>XXX</LastName><ResponseID>137ce83fe8ddb052-1698535326634</ResponseID><Date>2023.10.28 23:23:14</Date>
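For illustration, the extraction boils down to capturing the text between the ResponseID tags. Here is a minimal Python sketch of that regex, run against the sample message above:

```python
import re

# Sample message from the question
message = ("<?xml version='1.0' encoding='ISO-8859-1'?>"
           "<Submission Id=\"12345\"><LastName>XXX</LastName>"
           "<ResponseID>137ce83fe8ddb052-1698535326634</ResponseID>"
           "<Date>2023.10.28 23:23:14</Date>")

# Capture everything between the opening and closing ResponseID tags
match = re.search(r"<ResponseID>([^<]+)</ResponseID>", message)
response_id = match.group(1) if match else None
print(response_id)  # 137ce83fe8ddb052-1698535326634
```

In Splunk, the same pattern can be used with rex, e.g. something like `| rex "<ResponseID>(?<ResponseID>[^<]+)</ResponseID>" | table ResponseID`.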
Ok. We need to get the terminology straight.
1. There is no such thing as a "summary index" as a type of index. Splunk has only two types of indexes - events and metrics. You can have a summary index as an index which receives your summaries, but that's purely an organizational choice.
1a. You can have both summary events and any other kind of events in the same index.
2. There is summary indexing, meaning a process in which you generate data which is saved into your indexes for summarizing purposes.
3. There is no such thing as commands in the index. Searches can read from an index and write to one, but they are not in an index. So you're either using collect explicitly or it's done implicitly as a result of the summary indexing option in a scheduled search.
4. Indexes just hold data. They don't do anything with it. The data is either permanently transformed before being written to the index (that's what happens when data is collected to the summary index) or is dynamically transformed on read according to the sourcetype/source/host definition (in your case, the definition for the stash sourcetype). The index has nothing to do with it.
The summary indexing option in a scheduled search works the same way as the collect command does - the results get written to an intermediate csv file from which they are ingested into the destination index. But here you can't control the details as you can with a manually spawned collect command. So either fiddle with the configuration described in the article I linked (might work, might not; I haven't tried it myself), manually split the results at search time (but that might be problematic if you have spaces in your field values; in that case you could try to delimit multivalued fields differently before collecting), or split your events so that you don't have multivalued fields before collecting the summaries.
Hello, we have a data center with several types of equipment such as servers, switches, routers, EDR, some IoT sensors, virtualization, etc. Based on EPS, we need about 10 indexers per the Splunk recommendation. Now I want to separate the indexers into 4 clusters: one for servers, one for network devices, one for services, and the last one for security such as firewalls and EDR. Each cluster has several indexers and each forwarder sends data to the related cluster. Data only replicates within the origin cluster, not the other clusters. But I need each search head to be able to search across all 4 clusters - for example, searching for login failures across all of them (servers, network devices, etc.). Could I have several clusters with one cluster master? Best Regards
If you know all container names in advance, simply enumerate them. One way to do this is to use foreach.

index=* Initialised xxxxxxxxxxxx xxxxxx
| rex "\{consumerName\=\'(MY REGEX)"
| stats count as Connections by Container_Name
| transpose header_field=Container_Name column_name=Container_Name
| foreach "Container A", "Container B", "Container C", "Container D" [eval <<FIELD>> = if(isnull('<<FIELD>>'), "(missing)", '<<FIELD>>')]
| transpose header_field=Container_Name column_name=Container_Name
| addcoltotals fieldname=Connections labelfield=Container_Name

For example, if your data is missing "Container D", you get

Container_Name  Connections
Container A     1
Container B     1
Container C     1
Container D     (missing)
Total           3

If your data is missing "Container C", you get

Container_Name  Connections
Container A     1
Container B     1
Container D     1
Container C     (missing)
Total           3

And so on. Here is an emulation for you to play with and compare with real data

| makeresults
| fields - _time
| eval Container_Name = mvappend("Container A", "Container B"```, "Container C"```, "Container D")
``` data emulation above ```
@gcusello thanks for your reply. I have checked the connection by telnetting to Splunk and it connects successfully; I also cross-checked it by adding another log file path, which was added successfully. I have added the file path manually, but the file is still not showing in the Splunk GUI. I am going through the doc you provided and hope it will help.
CI field values won't be constant. Sometimes it can contain 3 values, sometimes 4 or 5 values, semicolon-separated. But the first word in the CI field is fixed: V2. How can we handle that with an inline rex or with props? Example:
"CI": "V2;Y;Windows;srv048;LogicalDisk;C:",
"CI": "V2;Y;Linx;srv048",
"CI": "V2;LX;apple;rose;server",
Hi @jip31, as you can read at https://docs.splunk.com/Documentation/Splunk/9.1.1/Knowledge/Aboutdatamodels , data models are tables containing the fields extracted from your events, but you don't have the events themselves, so you cannot see them in search results. Data models are very useful because they are very fast when searching on fields. They have a fixed syntax in the order of options (as with other Splunk commands), so you have to put the options in exactly the required order. Then, when you use data model fields, you have to remember to prefix them with the data model name; so, if in your TEST data model you have the EventCode field, you have to use: | tstats count from datamodel=TEST where TEST.EventCode=100 Data models are very important when you have structured data, to get very fast searches over large amounts of data. Ciao. Giuseppe
@RSS_STT You can also try adding this in props.conf. [cluster_test] EXTRACT-fields = "CI":\s"(?<CI_V2>.*)\;(?<CI_1>.*)\;(?<CI_2>.*)\;(?<CI_3>.*)\;(?<CI_4>.*)\;(?<CI_5>.*)\", I hope this will help you. Thanks, KV. If any of my replies helps you to solve the problem or gain knowledge, an upvote would be appreciated.
Hi @LearningGuy, yes, your search gives you a list of distinct values by ip: index=regular_index | stats values(company) AS company BY ip | table company ip but if you don't use "AS company" you won't have this field in the following table command. Is this your question, or do you have other doubts? Ciao. Giuseppe
Hi @Satyapv, let me understand: you want a link button that displays some panels in your dashboard instead of other panels, is this correct? If this is your requirement, you should look at the Link Switcher examples in the Splunk Dashboard Examples App (https://splunkbase.splunk.com/app/1603), which, except for the colour of the button, solve your need. Ciao. Giuseppe
Hi, I have created a basic data model called "TEST". I try to query this data model with tstats, but the only piece of code which returns a value is: | tstats count from datamodel=TEST But I can't see the events related to this request. And if I try to be more explicit in my request, like below, I have no results: | tstats count from datamodel=TEST where EventCode=100 So what is the problem? Other question: what is the benefit of using data models and the pivot command, since it's possible to query a data model without SPL? Thanks
Hi @NeAllen, to debug your search I need some samples of your logs to check the regexes. Then I see a strange thing: when using head you should sort so that you get the most relevant events, not just the first ones. So, sorting e.g. by the sum of download_bytes and upload_bytes, you could run something like this: index=o365 sourcetype=* src_ip="141.*" | rex field=_raw "download:(?<download_bytes>\d+)" | rex field=_raw "upload:(?<upload_bytes>\d+)" | eval total_bytes=download_bytes+upload_bytes | sort 10 -total_bytes | table UserId total_bytes download_bytes upload_bytes Ciao. Giuseppe
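As a sketch of what those two rex patterns extract, the same regexes can be tried in Python on a made-up raw event (the log line below is hypothetical, not from the thread):

```python
import re

# Hypothetical raw event; the field names mirror the rex commands in the search above
raw = "2023-10-28 12:00:00 UserId=alice download:1024 upload:2048"

# Same capture groups as | rex "download:(?<download_bytes>\d+)" and the upload variant
download_bytes = int(re.search(r"download:(\d+)", raw).group(1))
upload_bytes = int(re.search(r"upload:(\d+)", raw).group(1))
total_bytes = download_bytes + upload_bytes
print(total_bytes)  # 3072
```

The sort by total_bytes then simply ranks events on this computed sum before head/sort 10 keeps the top results.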
Hi @RSS_STT, I cannot debug your field extractions without accessing your system, but you could use a regex: | rex "\"CI\":\s+\"(?<CI_V2>[^;]*);(?<CI_1>[^;]*);(?<CI_2>[^;]*);(?<CI_3>[^;]*);(?<CI_4>[^;]*);(?<CI_5>[^\"]*)" or | rex field=CI "(?<CI_V2>[^;]*);(?<CI_1>[^;]*);(?<CI_2>[^;]*);(?<CI_3>[^;]*);(?<CI_4>[^;]*);(?<CI_5>[^\"]*)" that you can test at https://regex101.com/r/fndJqR/1 Ciao. Giuseppe
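As a quick sanity check, the first regex can be exercised outside Splunk; here is a minimal Python sketch run against the six-part sample from the question (note the pattern expects all six semicolon-separated parts, so the shorter CI samples would need optional groups):

```python
import re

# Six-part sample value from the question; Python uses (?P<name>...) for named groups
event = '"CI": "V2;Y;Windows;srv048;LogicalDisk;C:",'

pattern = (r'"CI":\s+"(?P<CI_V2>[^;]*);(?P<CI_1>[^;]*);(?P<CI_2>[^;]*);'
           r'(?P<CI_3>[^;]*);(?P<CI_4>[^;]*);(?P<CI_5>[^"]*)')

m = re.search(pattern, event)
print(m.group("CI_V2"), m.group("CI_4"), m.group("CI_5"))  # V2 LogicalDisk C:
```

Each `[^;]*` group stops at the next semicolon, and the last group runs to the closing quote, which is why this pattern only fires when all six parts are present.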
Hi @mukhan1, at first read:
https://docs.splunk.com/Documentation/Splunk/latest/Data/Monitorfilesanddirectories
https://lantern.splunk.com/Splunk_Platform/Getting_Started/Getting_data_into_Enterprise
then check if the connection between the Forwarder and Splunk is open by running a simple search on Splunk: index=_internal host=<your_forwarder_host> If you have events, the connection is established; if not, you primarily have to configure the connection. If the connection is ok, then you should have an inputs.conf file in $SPLUNK_HOME/etc/system/local. In this file you should have a stanza that starts with [monitor://yourfile]. Take the path you have after monitor:// and run ls -la on that path to see if your monitor stanza really reaches the file to monitor. The issue could be that the path isn't correct or that the user you're using to run Splunk doesn't have permissions on that folder. Manually modify the inputs.conf stanza and restart Splunk on the Forwarder. Ciao. Giuseppe
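For reference, a minimal monitor stanza in $SPLUNK_HOME/etc/system/local/inputs.conf might look like the sketch below; the path, index, and sourcetype are placeholders, not values from this thread:

```ini
[monitor:///var/log/myapp/myapp.log]
disabled = false
index = main
sourcetype = myapp_log
```

After editing, restart the forwarder so the new stanza is picked up.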
Hello, I want to copy my custom App, which includes a dashboard created in DashboardStudio, to another Splunk server. I have imported numerous images into DashboardStudio, and I would like to copy those images (including the associated kv-store data). Please let me know if there is a method to do this, such as copying files or using APIs. (By the way, the source server is configured as a search head cluster.)
Hello Team, I have a .log flat file. This file gives us data whenever we open it and run a command, and it produces some logs. Now I am integrating this .log file with Splunk, but it is not integrating. I ran the following command to integrate it: "/splunk/bin ---> ./splunk add monitor [file name]", and it gave me the message that the file has been added to the monitor list. However, I don't see this file in my Splunk. Further, once this file is in Splunk, how will Splunk take data from it whenever we run a command? Also, this .log file doesn't store data in any other directory; whenever we close the file, the data disappears. Please note the OS I'm using is Sun Solaris.
Hello Splunkers! I was wondering where I can turn on and view the MITRE ATT&CK posture for every notable in Enterprise Security as shown in the picture:
Hi @NeAllen .. you may need only one rex command (with two matches inside that single rex). The sample logs are needed; only then can we troubleshoot why the rex is not working as expected. Thanks.
Hi Team, I have downloaded the Splunk for Salesforce installation file but I have not installed it. Can someone help us with this issue? I have created a connected app in Salesforce to connect to Splunk, and I have to implement and test one of the Salesforce features. Best Regards Siva
Has anyone figured out how to use the Splunk SOAR IMAP app to connect to an Exchange mailbox? The goal is to read new email coming into the mailbox.