Will, I'll give you the karma for this. Weird how the TA on Splunkbase kicked me out to here.  Could be user error on my part. Thanks for your help.
Hi @Vin
Please could you share some of your raw events so that we can help you further? In the meantime, you might have some success with something like this?

| rex field=_raw max_match=0 "(?<numbers>\d+)"

Please let me know how you get on and consider adding karma to this or any other answer if it has helped.
Regards
Will
Below is the search. I need to extract the IDs shown in the event below, and there are many other IDs as well. Please help me write a query to extract the IDs that appear after "Duplicate Id's that needs to be displayed ::::::" in the log file.

index="*" source="*" "Duplicate Id's that needs to be displayed ::::::[6523409, 6529865]"
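A sketch that targets just the IDs inside the bracketed list, rather than every number in the event (the field names id_list and duplicate_ids are made up for illustration; adjust the literal prefix if your events differ):

```spl
index="*" source="*" "Duplicate Id's that needs to be displayed"
| rex field=_raw "Duplicate Id's that needs to be displayed\s*:+\[(?<id_list>[^\]]+)\]"
| eval duplicate_ids=split(replace(id_list, "\s+", ""), ",")
| mvexpand duplicate_ids
| table _time duplicate_ids
```

The rex captures everything between the brackets, and split/mvexpand turn the comma-separated list into one row per ID.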
I’m working on a Splunk search that needs to perform a lookup against a CSV file. The challenge is that some of the fields in the lookup table contain empty values, meaning an exact match doesn’t work. Here’s a simplified version of my search:

index="main" eventType="departure"
| table _time commonField fieldA fieldB fieldC fieldD fieldE
| lookup reference_data.csv commonField fieldA fieldB fieldC fieldD fieldE OUTPUTNEW offset

The lookup file reference_data.csv contains the fields: commonField, fieldA, fieldB, fieldC, fieldD, fieldE, lookupValue. Sometimes fieldB, fieldC, or other fields in the lookup table are empty. fieldA always has a value, sometimes the same one, but the value of the offset field changes based on the values of the other fields. If a lookup row has an empty value for fieldB, I still want it to match based on the available fields.

What I've tried:
- Using lookup normally, but it requires all fields to match exactly, which fails when lookup fields are empty.
- Creating multiple lookup commands for different field combinations, but this isn’t scalable.

Desired outcome:
- If commonField matches but fieldB is empty in the lookup file, I still want the lookup to return lookupValue.
- The lookup should prioritize rows with the most matching fields but still work even if some fields are missing.

Is there a way to perform a lookup in Splunk that allows matches even when some lookup fields are empty?
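One common workaround is a wildcard lookup: replace empty cells in the CSV with * and declare those fields as WILDCARD in a lookup definition. A sketch, assuming you can edit the CSV and that the lookup definition is named reference_data (the stanza name is an assumption):

```conf
# transforms.conf — lookup definition; empty cells in reference_data.csv
# must be replaced with a literal * for this to match anything
[reference_data]
filename = reference_data.csv
match_type = WILDCARD(fieldB) WILDCARD(fieldC) WILDCARD(fieldD) WILDCARD(fieldE)
max_matches = 1
```

Then call the definition (not the raw CSV) in the search: | lookup reference_data commonField fieldA fieldB fieldC fieldD fieldE OUTPUTNEW offset. Because max_matches = 1 returns the first matching row, order the CSV most-specific rows first so the row with the most concrete field values wins over the wildcard rows.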
Is anyone familiar with guidance on fulfilling the logging requirements for CTO 24-003 with Splunk queries and dashboards?
Found the issue: We built a standalone SH and copied the $SPLUNK_HOME/etc/apps directory from the SHC to it. We started removing apps on the test server, one at a time, and when we removed one of the apps and restarted, the searches started to work again. One of our crew found the following in the app that was just removed:

[source::stream:Gigamon]
EVAL-_time = strptime('timestamp', "%Y-%m-%dT%H:%M:%S,%N")

This seems to be the issue. We went back to the SHC and specified a source without removing anything, and it pulled data. Not really clear on why that would make a difference, but it does. The main takeaway is that a configuration change affecting _time caused this issue.
Hello, I'm trying to change the sourcetype at the indexer level based on the source. First question: is that possible on an indexer? Second: would it work with props.conf referencing the transforms?

transforms.conf
[testchange]
REGEX = .+
FORMAT = Sourcetype::testsourcetype
WRITE_META = true

thanks
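Yes, this is possible on an indexer (or the first full Splunk instance that parses the data), but rewriting the sourcetype uses DEST_KEY = MetaData:Sourcetype rather than WRITE_META, and the FORMAT prefix is lowercase sourcetype::. A sketch, keeping the stanza name testchange from the post; the source pattern in props.conf is an assumption:

```conf
# props.conf (on the indexer) — key the transform on the source
[source::/var/log/myapp/*.log]
TRANSFORMS-changest = testchange

# transforms.conf — rewrite the sourcetype for every matching event
[testchange]
REGEX = .
DEST_KEY = MetaData:Sourcetype
FORMAT = sourcetype::testsourcetype
```

Note this happens at index time, so it only affects data ingested after the change and a restart of the indexer.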
Hey all, I am new to Splunk Enterprise and I would like to understand more about metrics and the use of metric indexes. So far, I have created my own metric index by going to Settings > Indexes. I have a bunch of Splunk rules I have created, and so far I have used the mcollect command as follows:

host=(ip address) source=(source name) | mcollect index=(my_metric_index)

I am able to get a list of event logs showing on the Splunk dashboard, but I am not sure if the results showing in Search and Reporting are being stored under my metric index. When I check under the Indexes tab, my metric index is still at "0 MB", indicating no data.
Is there any way someone can help? Is it my index that needs work? Is it my search string query?
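One likely cause: mcollect needs each result to carry a metric_name and a numeric value (typically _value); raw events piped straight into mcollect usually produce nothing usable. A sketch, assuming you want to count events per minute (the metric name myapp.event_count is made up):

```spl
host=(ip address) source=(source name)
| bin _time span=1m
| stats count as _value by _time host
| eval metric_name="myapp.event_count"
| mcollect index=my_metric_index
```

To verify the data landed, query the metric index with mstats rather than a normal event search:

```spl
| mstats sum(myapp.event_count) where index=my_metric_index span=1h
```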
Unfortunately, the logs do not have the strings "gets created in system", "gets modified...", or the like. The only information we see in the logs is:

_time tradeNumber received
_time tradeNumber sent
_time tradeNumber received
_time tradeNumber sent
_time tradeNumber received
_time tradeNumber sent
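Assuming events strictly alternate received/sent per tradeNumber, a sketch that pairs each "sent" with the immediately preceding event for the same tradeNumber (the index name and field names are taken from the post or made up):

```spl
index=trades ("received" OR "sent")
| sort 0 _time
| eval action=if(searchmatch("sent"), "sent", "received")
| streamstats current=f window=1 last(_time) as prev_time by tradeNumber
| where action="sent"
| eval delay=_time - prev_time
| stats max(delay) as max_delay by tradeNumber
```

streamstats with current=f window=1 carries forward the previous event's _time per tradeNumber, so the eval gives the per-step received-to-sent delay rather than the overall range.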
Hello, and I have another weird issue: When I execute a search on a SHC in the Search and Reporting App, getting data from 2025-02-27 index=test earliest=-7d@d latest=-6d@d I get zero events When I execute the search WITHOUT the earliest and latest time modifiers and use the Time Picker in the UI which results in "during Thu, Feb 27, 2025" I get around 167,153 results Specifying the time range with earliest and latest time modifiers is NOT giving me the "Your timerange was substituted based on your search string". If I use tstats, I get the correct number of events, the correct date, and the message "Your timerange was substituted based on your search string" is present | tstats count where index=test earliest=-7d@d latest=-6d@d by _time span=d I also made index=test earliest=-7d@d latest=-6d@d a saved search which executes every 10 minutes - zero events. Another bit of weirdness: If I run that search, and specify "All time", it will pull events ONLY for 2025-02-27. Nothing for other dates, and it has 12 months of events, populated for every day. So, it looks at both the time qualifiers and the time picker under that scenario. Any ideas what might be causing this? (I have several standalone searchheads that are working fine)
Hi @DPOIRE ,
you have to extract the correct delays and then use them as you like (note: use null() rather than "" in the eval, so non-matching events are ignored by earliest/latest):

<your_search>
| stats
    earliest(eval(if(searchmatch("gets created in system"),_time,null()))) AS gets_created_in_system
    latest(eval(if(searchmatch("gets sent to market"),_time,null()))) AS gets_sent_to_market
    earliest(eval(if(searchmatch("gets modified in system"),_time,null()))) AS gets_modified_in_system
    latest(eval(if(searchmatch("gets sent to market with modification"),_time,null()))) AS gets_sent_to_market_with_modification
    earliest(eval(if(searchmatch("gets cancelled in system"),_time,null()))) AS gets_cancelled_in_system
    latest(eval(if(searchmatch("gets sent to market as cancelled"),_time,null()))) AS gets_sent_to_market_as_cancelled
    BY TradeNumber

In this way you'll have the epoch time of each event in the same row, and you can calculate all the diffs you need.
Ciao.
Giuseppe
Hi @Priya70 ,
in this case the issue is in the verification algorithm, not in the search! Adapt it to your data.
Ciao.
Giuseppe
Hi @Jailson ,
the Time Picker works only on _time, not on a field like deletion_date. If you want to filter your data on that field, you have to add the filter to the main search.
In addition, after the top command you have only the fields in the command, in your case: categoryId, count, percent. If you want to filter your data by deletion_date, you have to put this filter in the main search or before the top command, obviously only if you have this field in your data.
The syntax depends on the format of your deletion_date field. For example, if it's in the format "yyyy-mm-dd" and you want only results where deletion_date > 2024-12-31, you could use something like this:

sourcetype=access_* status=200 action=purchase
| eval deletion_date_epoch=strptime(deletion_date,"%Y-%m-%d"), deletion_date_filter_epoch=strptime("2024-12-31","%Y-%m-%d")
| where deletion_date_epoch>deletion_date_filter_epoch
| top categoryId

Ciao.
Giuseppe
@DPOIRE  Simulates trade events using makeresults, assigns timestamps, and labels each step (New Order, Modification, Cancellation). Uses streamstats to track event sequence, capture previous timestamps, and calculate time delay for each step.    
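A sketch of that approach, with the scenario's timestamps hard-coded via makeresults purely for illustration (field names are made up):

```spl
| makeresults
| eval raw="09:30,created;09:32,sent;09:45,modified;09:50,sent;09:55,cancelled;09:56,sent"
| makemv delim=";" raw
| mvexpand raw
| eval time_str=mvindex(split(raw,","),0), action=mvindex(split(raw,","),1)
| eval _time=strptime("2025-02-27 ".time_str, "%Y-%m-%d %H:%M")
| eval TradeNumber=13400101
| sort 0 _time
| streamstats current=f window=1 last(_time) as prev_time by TradeNumber
| eval delay=if(action="sent", _time - prev_time, null())
| table _time TradeNumber action delay
```

Each "sent" row gets the delay from the preceding step, giving the three per-step delays (120 s, 300 s, 60 s) instead of the overall 26-minute range.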
@Jailson  What exactly are you looking for? Could you elaborate a bit more?
I have a survey that has a date field deletion_date. How can I filter this field by the Time range?

sourcetype=access_* status=200 action=purchase
| top categoryId
| where deletion_date > ?
Hi,
Here is a scenario:

Step 1
9h30 TradeNumber 13400101 gets created in system
9h32 TradeNumber 13400101 gets sent to market
Step 2
9h45 TradeNumber 13400101 gets modified in system
9h50 TradeNumber 13400101 gets sent to market with modification
Step 3
9h55 TradeNumber 13400101 gets cancelled in system
9h56 TradeNumber 13400101 gets sent to market as cancelled

I need to monitor the delay for sending the order to market. In the above scenario we have 3 steps for the same TradeNumber, and each needs to be calculated separately:
Delay for sending the new trade
Delay for modifying
Delay for cancelling
The log does not allow me to differentiate the steps, but the sequence is always in the right order. If I use
| stats range(_time) as Delay by TradeNumber | stats max(Delay)
for TradeNumber 13400101, it will return 26 mins. I am looking for a result of 5 mins (gets modified at 9h45 to gets sent to market with modification at 9h50).
Is there any way Splunk can match by sequence (or something else) and TradeNumber to calculate 3 values for the same TradeNumber?
Regular expressions don't handle negation well. The given regex will match the sample event because [^("EventType": 500)] is a negated character class (it matches any single character not in that set), not a "does not contain" test. It's probably better to keep matching events and discard the rest.

[solarwinds:alerts]
TRANSFORMS-t = keep-5000, delete-others

[keep-5000]
REGEX = ("EventType": 5000)
DEST_KEY = queue
FORMAT = indexQueue

[delete-others]
REGEX = .
DEST_KEY = queue
FORMAT = nullQueue
Hi @gcusello
I can confirm that the regex is correct because I see the app names when I display them in the table.
The problem is it's not returning anything when there are devices with only an uninstall event but no subsequent install event for the same application.
Also, I'm not sure why I keep getting events for the same application being "removed successfully" every day when there is no installation of the application later on.
I want to send all events to the nullQueue except those matching "EventType": 5000.

{"EventID": 2154635, "EventType": 5000, "NetObjectValue": null, "EngineID": null}

[solarwinds:alerts]
TRANSFORMS-t = eliminate-except-5000

[eliminate-except-5000]
REGEX = [\w\W]+[^("EventType": 500)]
DEST_KEY = queue
FORMAT = nullQueue
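If a single transform is preferred, a sketch using a negative lookahead (transforms.conf REGEX is PCRE, so lookaheads work; this assumes the events are single-line JSON as in the sample):

```conf
# transforms.conf — route everything that does NOT contain "EventType": 5000 to the nullQueue
[eliminate-except-5000]
REGEX = ^(?!.*"EventType": 5000).*$
DEST_KEY = queue
FORMAT = nullQueue
```

The lookahead asserts, from the start of the event, that "EventType": 5000 appears nowhere in it; only such events match and get dropped.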