All Posts

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.

If your problem is resolved, then please click the "Accept as Solution" button to help future readers.
Hello again @gcusello ! Sorry again. I want to return id, nr_of_days (the difference between last_date and first_date), the login of last_date (which could be today, yesterday, a month ago, etc.) and the login of first_date (where first_date is 365 or more days ago).
I looked at the events for the component you mentioned and found that there is only one type of log entry. I also tried it for the "last 7 days" time range.  
How can you say your script is executing fine if it is not doing what you expect? Try running the search without the collect command and see what happens. Try running the search over different time chunks (if possible) to see what happens. Try running parts of the search to see whether data goes missing at any point. Try other ways to diagnose your issue, as you haven't given us any useful information (so far) to help determine what might be going on.
Hi, I have a Splunk dashboard created in Dashboard Studio. The dashboard has 3 tables, and all the values in these tables are either left- or right-aligned, but I want them to be center-aligned. I tried finding solutions, but all the solutions mentioned in other posts are for Classic dashboards, which are written in XML. How can we do this in a JSON-based dashboard? Thanks, Viral
Hello All, I have a lookup file which stores data about hosts across multiple indexes. I have reports which fetch information about the hosts in each index and update the records in the lookup file. Can I run a parallel search for the hosts in each index and thus update the same lookup file in parallel? Or is there a risk to performance or to the consistency of the data? Thank you, Taruchit
@ITWhisperer My script is executing fine but is filling in no data for 5th August in the summary index.
There's a talk here with some demos: https://community.splunk.com/t5/Splunk-Tech-Talks/Machine-Learning-Assisted-Adaptive-Thresholding/ba-p/676851
Your initial search might not be the best way to get what you're searching for in the first place. Remember that Splunk's subsearches have their limits and might behave strangely, giving empty or wrong results if those limits are reached.
+1 on that. Whenever possible, don't use SHOULD_LINEMERGE=true. It's a very expensive setting that causes Splunk to try to re-merge already split events into bigger ones. While it has some use in very specific border cases, as a rule of thumb you should avoid it completely. That's what a proper LINE_BREAKER is for.
OK, it's a start. So UDP packets are reaching your receiver box. What then? Are you trying to receive the data directly on your Splunk instance (or a forwarder)? Or are you using some intermediate syslog receiver like rsyslog or syslog-ng? Did you check your local firewall? If it's a Linux box, did you verify the rp_filter settings and routing?
Do you have an example of this? I'm trying to work through it but I can't get anything to work. 
You can expect Splunk to use environment variables only in the cases documented in the conf file specs. So if you want to use a variable's value, you need to resolve the variable yourself within the script.
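To illustrate the point about resolving variables yourself, here is a minimal Python sketch (not from the original post): the script expands the variable reference itself rather than expecting Splunk to do it. SPLUNK_HOME and the path are just example names.

```python
import os

def resolve_path(template: str) -> str:
    """Expand $VAR / ${VAR} references ourselves, since Splunk will not
    substitute environment variables in arbitrary settings."""
    return os.path.expandvars(template)

# Example: make sure the variable exists, then resolve it in the script.
os.environ.setdefault("SPLUNK_HOME", "/opt/splunk")
print(resolve_path("$SPLUNK_HOME/etc/apps"))
```

The same idea applies in any scripting language: read the variable from the process environment inside the script, then build the value you need from it.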
Okay, so it looks like you're getting a list of dictionaries returned, so it shouldn't be too hard for us to figure this out. I'm not an expert code writer, by any means, but I've messed around with it enough to be able to troubleshoot at least, so I'll try to help. You should be able to do something like this, where 'example' is the response from the previous call:

for x in example:
    for k, v in x.items():
        if k == 'name':
            server_name = v
            url = 'https://su-ns-vpx-int-1.siteone.com/nitro/v1/config/lbvserver_servicegroup_binding/' + server_name
            <create API call here using new url>

(Note the .items() — iterating a dict directly only yields its keys, so "for k, v in x:" would fail.)
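To make the loop above concrete, here is a runnable sketch under the post's assumptions: 'example' is a list of dicts, each (usually) carrying a 'name' key, and the base URL comes from the post. The sample data and the binding_urls helper are made up for illustration; the actual GET request (authentication, headers) is left as a comment since those details aren't in the thread.

```python
BASE = "https://su-ns-vpx-int-1.siteone.com/nitro/v1/config/lbvserver_servicegroup_binding/"

def binding_urls(example):
    """Build one binding URL per dictionary that has a 'name' key."""
    return [BASE + x["name"] for x in example if "name" in x]

# Hypothetical response shape from the previous NITRO call:
example = [{"name": "vs1"}, {"name": "vs2"}, {"other": "ignored"}]

for url in binding_urls(example):
    print(url)
    # here you would issue the request, e.g. requests.get(url, auth=..., verify=...)
```

Separating URL construction from the request itself also makes the loop easy to test without touching the appliance.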
Thanks, @richgalloway. Trying it out now; will let you know if it works.
Hi Splunkers, The idea is to pull any new file creations in a particular folder inside C:\users\<username>\appdata\local\somefolder. I wrote a batch script to pull and index this data. It's working, but the issue is that I cannot define a token for users. E.g.: in the script, if I specify the path as C:\users\<user1>\appdata\local, the batch script runs as expected and the data is indexed into Splunk, but if I replace user1 with %userprofile% or %localappdata%, the batch script does not run. How can I resolve this?
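One likely cause (an assumption, since the thread doesn't confirm it): Splunk scripted inputs run under the Splunk service account (often SYSTEM), so %userprofile% and %localappdata% expand to that account's profile rather than each logged-in user's. A workaround is to enumerate the profiles explicitly. A Python sketch of the idea (the folder name "somefolder" and the base= override for testing are illustrative, not from the post):

```python
import glob
import os

def user_folders(pattern=r"C:\Users\*\AppData\Local\somefolder", base=None):
    """Find the target folder under every user profile explicitly,
    instead of relying on %userprofile%, which expands to the profile
    of the account running the script."""
    if base:  # allow overriding the root, e.g. for testing on another OS
        pattern = os.path.join(base, "*", "AppData", "Local", "somefolder")
    return sorted(glob.glob(pattern))

for folder in user_folders():
    for name in sorted(os.listdir(folder)):
        # emit one line per file for Splunk to index
        print(os.path.join(folder, name))
```

The same wildcard approach works in batch (e.g. a `for /d %%u in (C:\Users\*) do ...` loop), which avoids depending on per-user environment variables entirely.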
Shooting out one response since it applies to all. Thank you, everyone, for your guidance and knowledge! It always helps to have a strong community backing some of these questions, so I appreciate it!
Try these settings:

[applog_test]
DATETIME_CONFIG =
LINE_BREAKER = ([\r\n]+)ERROR
NO_BINARY_CHECK = true
category = Custom
disabled = false
pulldown_type = true
SHOULD_LINEMERGE = false
TIME_FORMAT = %Y-%m-%d %H:%M:%S,%3N
TIME_PREFIX = ERROR\s+

Don't specify BREAK_ONLY_BEFORE_DATE if you want to break at something other than a date.  Also, don't use both BREAK_ONLY_BEFORE_DATE and LINE_BREAKER in the same stanza.  When using LINE_BREAKER, set SHOULD_LINEMERGE to false.
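To see roughly what that LINE_BREAKER does, here is a small Python simulation (a sketch, not Splunk's actual parser): Splunk breaks the stream where the regex matches and discards only the text captured by the first group, so the "ERROR" token stays at the start of each event. The lookahead below mimics that discard-only-group-1 behaviour, and the sample log lines are invented.

```python
import re

# Lookahead version of ([\r\n]+)ERROR: the newlines are consumed,
# "ERROR" remains with the following event.
LINE_BREAKER = r"([\r\n]+)(?=ERROR)"

raw = ("ERROR 2024-08-05 10:00:01,123 first event\n"
       "  continuation / stack trace line\n"
       "ERROR 2024-08-05 10:00:02,456 second event")

# re.split keeps the captured newlines as separate items; drop them.
events = [e for e in re.split(LINE_BREAKER, raw) if e.strip()]
for e in events:
    print(repr(e))
```

Note how the continuation line stays attached to the first event, which is exactly why a good LINE_BREAKER makes SHOULD_LINEMERGE unnecessary.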
@gcusello, I am looking only for the forwarder and indexer.
If your app is running in IIS, and you restarted both the agent and IIS, it should work. Here are some questions:
1. If you are running a machine agent on the same server, please ensure that in the machine agent's controller-info.xml you set dotnet compatibility mode to "true".
2. Ensure you generate some load on the IIS application; it will only register the tiers if there is load on them. The easiest way is to open the browse option for whichever applications you have deployed and click through the app, or refresh the base page a number of times.