@cbiraris  In Splunk, retention policies are set at the index level, not at the sourcetype level. This means that all sourcetypes within a single index (like your xyz index) will inherit the same retention period (4 months, in your case). Unfortunately, there's no native way to assign different retention periods to individual sourcetypes within the same index.
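A common workaround is to route the shorter-retention sourcetype to its own index and give that index its own frozenTimePeriodInSecs. A sketch of what that might look like (the index name, paths, and retention value are placeholders, not from your environment):

```ini
# indexes.conf on the indexers -- hypothetical dedicated index
[xyz_short_retention]
homePath   = $SPLUNK_DB/xyz_short_retention/db
coldPath   = $SPLUNK_DB/xyz_short_retention/colddb
thawedPath = $SPLUNK_DB/xyz_short_retention/thaweddb
# ~1 month here, while the xyz index keeps its 4-month setting (~10368000s)
frozenTimePeriodInSecs = 2592000
```

New data for that sourcetype would then need to be pointed at the new index, either with index = xyz_short_retention in its inputs.conf stanza or via an index-time routing transform.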
@PickleRick  I have executed the commands but nothing relevant to my stanza is visible. FYI, my current input settings:

inputs.conf:

[monitor://E:\var\log\Bapto\BaptoEventsLog\SZC\000000000*-*-SZC.VIT.BaptoEvents.*]
whitelist = \.csv$
disabled = false
index = Bapto
initCrcLength = 256
sourcetype = SZC_BaptoEvent

props.conf:

[SZC_BaptoEvent]
SHOULD_LINEMERGE = false
#CHARSET = ISO-8859-1
TIME_FORMAT = %Y-%m-%d %H:%M:%S.%3N
MAX_TIMESTAMP_LOOKAHEAD = 23
TRANSFORMS-drop_header = remove_csv_header
TZ = UTC

transforms.conf:

[remove_csv_header]
REGEX = ^Timestamp;AlarmId;SenderType;SenderId;Severity;CreationTime;ComplexEventType;ExtraInfo
DEST_KEY = queue
FORMAT = nullQueue

Sample of the CSV files to be monitored:

Timestamp;AlarmId;SenderType;SenderId;Severity;CreationTime;ComplexEventType;ExtraInfo
2025-03-27 12:40:12.152;1526;Mpg;Shuttle_115;Information;2025-03-27 12:40:12.152;TetrisPlanningDelay;TetrisId: TetrisReservation_16_260544_bqixLeVr,ShuttleId: Shuttle_115,FirstDelaySection: A24.16,FirstSection: A8.16,LastSection: A24.16
2025-03-27 12:40:12.152;1526;Mpg;Shuttle_115;Unknown;2025-03-27 12:40:12.152;TetrisPlanningDelay;
2025-03-27 12:40:14.074;0;Shuttle;Shuttle_027;Unknown;2025-03-27 12:40:14.074;NoError;
2025-03-27 12:40:16.056;0;Shuttle;Shuttle_051;Unknown;2025-03-27 12:40:16.056;NoError;
2025-03-27 12:40:30.076;0;Shuttle;Shuttle_119;Unknown;2025-03-27 12:40:30.076;NoError;
As others already pointed out - no. So you've just hit one of the main reasons for splitting data into indexes. There are two main factors when deciding whether you want the data in a single index or multiple ones:

1) Data retention settings (and that's your case)
2) Access control

Both of those work at the index level. There are some other things which might come into play in some border cases (like not mixing high-volume and low-volume data in a single index), but you much less often get that deep into data architecture.
splunk list inputstatus
splunk list monitor

What do these two have to say? Since you're ingesting CSV files which have fixed headers, there's a good chance the CRCs match and the files are not ingested because they are treated as already seen. You might want to increase initCrcLength (or fiddle with crcSalt, but that's a last resort).
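For reference, both settings live in the monitor stanza in inputs.conf. A sketch of what that could look like here (the value 1024 is an example, not tested against your files):

```ini
# inputs.conf -- read a longer prefix when fingerprinting each file, so the
# identical CSV header line no longer makes every file look "already seen"
[monitor://E:\var\log\Bapto\BaptoEventsLog\SZC\]
whitelist = \.csv$
index = Bapto
sourcetype = SZC_BaptoEvent
initCrcLength = 1024

# Last resort -- mixes the file path into the CRC; adding it later can cause
# already-indexed files to be re-ingested:
# crcSalt = <SOURCE>
```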
So, the "Company Code" problem is solved, but now you have another problem? Please share more specifics.
@gcusello Yes, I have tried, but nothing works.
Hi @vsommer I have tried your suggestion but still no luck.

[monitor://E:\var\log\Bapto\BaptoEventsLog\SZC\]
whitelist = \.csv$
Hello, I want to configure an alert for when a queue is full. We have Max Queue Depth and Current Queue Depth metrics. The problem is that there are 100 queues, and each queue has a different max value, so I can't use * for calculating the percentage. I don't want 100 health rules, and * is not allowed in a metric expression. Is there any way to set up such an alert? AppDynamics
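One option worth considering: if you can pull both metric series out of AppDynamics (e.g. via its REST API - the retrieval itself is not shown here), the per-queue percentage is straightforward to compute in a small script and alert on externally. A minimal sketch in Python, assuming you already have both metrics as dicts keyed by queue name:

```python
# Sketch: flag queues whose current depth is at or above a threshold
# fraction of that queue's own max depth. Metric retrieval is assumed to
# happen elsewhere; the dicts and queue names below are placeholders.

def full_queues(current_depth, max_depth, threshold=0.9):
    """Return {queue: fill_ratio} for queues at or above the threshold."""
    alerts = {}
    for queue, maximum in max_depth.items():
        depth = current_depth.get(queue, 0)
        if maximum > 0 and depth / maximum >= threshold:
            alerts[queue] = depth / maximum
    return alerts

# Example with made-up numbers:
current = {"ORDERS.IN": 950, "PAYMENTS.OUT": 10}
maxes = {"ORDERS.IN": 1000, "PAYMENTS.OUT": 5000}
print(full_queues(current, maxes))  # {'ORDERS.IN': 0.95}
```

Because each queue is compared against its own max, one loop replaces the 100 individual health rules; the threshold is a single tunable parameter.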
Results are coming, but the ones with similar names are not; entries with similar values in the dns field are not showing up.
By default it's supposed to be simple mode. But (and that's a big but), AOB might default to XML (and might not even be able to do it differently). You can check it like this (an example from my home lab):

# /opt/splunk/bin/splunk cmd python /opt/splunk/etc/apps/TA-api-test/test_input_1.py --scheme
<scheme>
  <title>test_input_1</title>
  <description>Go to the add-on's configuration UI and configure modular inputs under the Inputs menu.</description>
  <use_external_validation>true</use_external_validation>
  <streaming_mode>xml</streaming_mode>
  <use_single_instance>false</use_single_instance>
  <endpoint>
    <args>
      <arg name="name">
        <title>test_input_1 Data Input Name</title>
      </arg>
      <arg name="placeholder">
        <title>placeholder</title>
        <required_on_create>0</required_on_create>
        <required_on_edit>0</required_on_edit>
      </arg>
    </args>
  </endpoint>
</scheme>

As you can see - it's XML mode. And I'm not sure you can change that. At least I didn't see any option in AOB to change that. You might be able to fiddle with the input definition in AOB to see if it can explicitly break the REST results into separate events.
Sorry, try with double quotes around "Company Code" in the values function:

| stats values("Company Code") as "Company Code" by timeval ip dns "Operation System" severity pluginname timeval Scan-Location is_solved blacklisted
After running the search, the "Company Code" field is empty.
| inputlookup lkp-all-findings
| lookup lkp-findings-blacklist.csv blfinding as finding OUTPUTNEW blfinding
| lookup lkp-asset-list-master "IP Adresse" as ip OUTPUTNEW Asset_Gruppe Scan-Company Scanner Scan-Location Location "DNS Name" as dns_name Betriebssystem as "Operation System"
| lookup lkp-GlobalIpRange.csv 3-Letter-Code as Location OUTPUTNEW "Company Code"
| eval is_solved=if(lastchecked>lastfound OR lastchecked == 1,1,0), blacklisted=if(isnull(blfinding),0,1), timeval=strftime(lastchecked,"%Y-%m-%d")
| fillnull value="NA" "Company Code", Scan-Location
| search is_solved=0 blacklisted=0 Scan-Location="*" "Company Code"="*" severity="high"
| stats values("Company Code") as "Company Code" by timeval ip dns "Operation System" severity pluginname timeval Scan-Location is_solved blacklisted
| fields "Company Code" timeval ip dns "Operation System" severity pluginname timeval Scan-Location is_solved blacklisted
| sort severity
Hi @uagraw01, you can also change your stanza to this:

[monitor://E:\var\log\Bapto\BaptoEventsLog\SZC\]
whitelist = \.csv$

Hope this helps you.
Your info token equates to a short code, yet your search is converting the code to a friendlier term before you search - could this be why your search is not working?

| eval "Info Transaction CI HUB"=case(
    AddtionalOrgnl == "O 123", "Normal Transaction",
    AddtionalOrgnl == "O 70", "Velocity Transaction",
    AddtionalOrgnl == "O 71", "Gambling RFI",
    AddtionalOrgnl == "O 72", "Gambling OFI",
    AddtionalOrgnl == "O 73", "DTTOT Transaction",
    true(), "Other")
| rename EndtoendIdOrgnl as "End To End Id"
| search "Info Transaction CI HUB"="$info$"
Oh yes, sorry, I gave the wrong search. This is the search:

| inputlookup lkp-all-findings
| lookup lkp-findings-blacklist.csv blfinding as finding OUTPUTNEW blfinding
| lookup lkp-asset-list-master "IP Adresse" as ip OUTPUTNEW Asset_Gruppe Scan-Company Scanner Scan-Location Location "DNS Name" as dns_name Betriebssystem as "Operation System"
| lookup lkp-GlobalIpRange.csv 3-Letter-Code as Location OUTPUTNEW "Company Code"
| eval is_solved=if(lastchecked>lastfound OR lastchecked == 1,1,0), blacklisted=if(isnull(blfinding),0,1), timeval=strftime(lastchecked,"%Y-%m-%d")
| fillnull value="NA" "Company Code", Scan-Location
| search is_solved=0 blacklisted=0 Scan-Location="*" "Company Code"="*" severity="high"
| fields "Company Code" timeval ip dns "Operation System" severity pluginname timeval Scan-Location is_solved blacklisted
| sort severity
Hi @uagraw01, what about using only * instead of *.csv? Then, did you try the whitelist option instead of inserting the file pattern in the input stanza? Ciao. Giuseppe
Dear Splunkers!! I am facing an issue with Splunk file monitoring configuration. When I define the complete absolute path in the inputs.conf file, Splunk successfully monitors the files. Below are two examples of working stanza configurations:

Working configurations:

[monitor://E:\var\log\Bapto\BaptoEventsLog\SZC\0000000002783979-2025-03-27T07-39-33-128Z-SZC.VIT.BaptoEvents.50301.csv]
[monitor://E:\var\log\Bapto\BaptoEventsLog\SZC\0000000002783446-2025-03-27T05-09-20-566Z-SZC.VIT.BaptoEvents.50296.csv]

However, since more than 200 files are generated, specifying absolute paths for each file is not feasible. To automate this, I attempted to use a wildcard pattern in the stanza, as shown below:

Non-working configuration:

[monitor://E:\var\log\Bapto\BaptoEventsLog\SZC\*.csv]

Unfortunately, this approach does not ingest any files into Splunk. I would appreciate your guidance on resolving this issue. Looking forward to your insights.
Hi Morelz, Any news / progress on this?
Hi @Leonardo1998

In order to index this as a lowercase field, we need to establish how it's derived. Checking the app's props/transforms, there are a number of REGEX which extract "subscription_id" from various fields, such as below - however, as you mentioned, these are subscription_id, not subscriptionId!

[mscs_extract_subscription_id_and_resource_group]
SOURCE_KEY = AzureResourceId
REGEX = (?i:subscriptions)\/([^\/]+)(?:\/(?i:resourceGroups)\/([^\/]+))?
FORMAT = subscription_id::$1 resource_group::$2

[mscs_extract_subscription_id_and_resource_group_from_id]
SOURCE_KEY = id
REGEX = (?i:subscriptions)\/([^\/]+)(?:\/(?i:resourceGroups)\/([^\/]+))?
FORMAT = subscription_id::$1 resource_group::$2

However.. I did find this:

[azure_data_share_extract_from_properties]
SOURCE_KEY = properties
REGEX = \"(\w+)\":\"({.*}|.*?)\"
FORMAT = $1::$2

Which extracts key/value pairs from properties, and I *think* subscriptionId and subscriptionid get extracted from there, based on this:

coalesce('subscriptionId', 'properties.subscriptionId', 'properties.subscriptionid', SUBSCRIPTIONS)

It looks like the source data contains differently cased fields... not ideal! Anyway - if you let me know the sourcetype you are looking at, I can try and help put together index-time props/transforms to index this... or... the other thing you might like to do is an eval field to coalesce them at search-time so you have a consistent value. You might actually find that "vendor_account" already does this, but if not you could do this:

[yourSourcetype]
EVAL-subscriptionId=coalesce(subscriptionId, subscriptionid)

However, we'd need to check the order of execution for the EVAL - or just see if it works. Please let me know how you get on, and consider adding karma to this or any other answer if it has helped.

Regards
Will