All Posts

Unfortunately I don't think that you have any other option. That was the reason why @gjanders created this app...
Hi @Taruchit, in addition, you could use a KV store lookup so you'll have a key that guarantees the uniqueness of the data. Just one question: you're trying to use Splunk as a database, and Splunk isn't a database - are you sure you're using the best solution for your requirements? Ciao. Giuseppe
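For illustration, a minimal SPL sketch of that idea, assuming a KV store lookup definition named host_status_lookup already exists and that index, sourcetype, and field names are placeholders for your own:

    index=main sourcetype=my_sourcetype
    | stats latest(status) AS status BY host
    | outputlookup host_status_lookup key_field=host

Because key_field tells outputlookup which field to use as the KV store key, re-running the search updates the existing record for each host instead of writing duplicate rows.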
I will give it a try, thanks for the help.
The lookup file is written by the search head in one go.  There is no parallelism.
Hi @Skins, I faced the same limitation - the inability to use an ingest-time lookup on hw. Did you manage to solve this issue?
If your problem is resolved, then please click the "Accept as Solution" button to help future readers.
Hello again @gcusello! Sorry again. I want to return id, nr_of_days (the difference between last_date and first_date), the login of last_date (which could be today, yesterday, a month ago, etc.), and the login of first_date (where first_date is 365 days or more ago).
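As a rough sketch of how those fields could be computed (index, sourcetype, and field names here are only assumptions, not the thread's actual data):

    index=my_index sourcetype=my_logins
    | stats min(_time) AS first_date max(_time) AS last_date BY id
    | eval nr_of_days = round((last_date - first_date) / 86400)
    | where nr_of_days >= 365
    | eval first_login = strftime(first_date, "%Y-%m-%d"), last_login = strftime(last_date, "%Y-%m-%d")
    | table id nr_of_days first_login last_login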
I looked at the events for the component you mentioned and found that there is only one type of log entry. I also tried it for the "last 7 days" time range.  
How can you say your script is executing fine if it is not doing what you expect? Try running the search without the collect command and see what happens. Try running the search with different time chunks (if possible) to see what happens. Try running parts of the search to see whether data goes missing at any point. Try other ways to diagnose your issue, as you haven't given us any useful information (so far) to help you determine what might be going on.
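For example, one quick check is to run the base search on its own (the search below is just a placeholder for your own) over the time range in question and count events per day, before adding collect back in:

    index=my_index sourcetype=my_sourcetype
    | timechart span=1d count

If the day in question already shows zero here, the gap comes from the search or its time range, not from the collect command or the summary index.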
Hi, I have my Splunk dashboard created in Dashboard Studio. The dashboard has 3 tables and all the values in these tables are either left or right aligned, but I want them to be center aligned. I tried finding solutions, but all the solutions mentioned in other posts are for Classic dashboards, which are written in XML. How can we do this in a JSON-defined Dashboard Studio dashboard? Thanks, Viral
Hello All, I have a lookup file which stores data about hosts across multiple indexes. I have reports which fetch information about hosts from each index and update the records in the lookup file. Can I run a parallel search for the hosts related to each index and thus update the same lookup file in parallel? Or is there a risk to performance or the consistency of the data? Thank you, Taruchit
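One way to sidestep concurrent writes entirely is to merge the per-index searches into a single scheduled search that writes the lookup once. A sketch of that pattern (index, field, and lookup names are hypothetical):

    index=index_a OR index=index_b OR index=index_c
    | stats latest(_time) AS last_seen BY host, index
    | inputlookup append=true hosts.csv
    | dedup host, index
    | outputlookup hosts.csv

The fresh results come first in the pipeline, so dedup keeps them in preference to the older rows appended from the existing lookup.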
@ITWhisperer My script is executing fine, but it is filling no data for 5th August into the summary index.
There's a talk here with some demos: https://community.splunk.com/t5/Splunk-Tech-Talks/Machine-Learning-Assisted-Adaptive-Thresholding/ba-p/676851
Your initial search might not be the best way to get what you're searching for in the first place. Remember that Splunk's subsearches have their limits and might behave weirdly and give empty or wrong results if those limits are reached.
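For reference, the relevant settings live in the [subsearch] stanza of limits.conf; the values below are only illustrative of the kind of caps involved, so check your own deployment's settings and docs before changing anything:

    [subsearch]
    maxout = 10000
    maxtime = 60

When a subsearch hits maxout or maxtime, it is silently truncated, which is how the "empty or wrong results" symptom tends to show up.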
+1 on that. Whenever possible, don't use SHOULD_LINEMERGE=true. It's a very expensive setting that causes Splunk to try to re-merge already split events into bigger ones. While it has some use in very specific border cases, as a rule of thumb you should avoid it completely. That's what a proper LINE_BREAKER is for.
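As an illustration, a props.conf stanza that breaks events on a newline followed by an ISO-style timestamp, with line merging disabled (the sourcetype name and the timestamp pattern are only examples and need to match your data):

    [my_sourcetype]
    SHOULD_LINEMERGE = false
    LINE_BREAKER = ([\r\n]+)\d{4}-\d{2}-\d{2}[ T]\d{2}:\d{2}:\d{2}
    TRUNCATE = 10000

The first capture group in LINE_BREAKER is consumed as the event boundary, so the timestamp itself stays at the start of the next event.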
OK, it's a start. So UDP packets are reaching your receiver box. What then? Are you trying to receive the data directly on your Splunk instance (or a forwarder)? Or are you using some intermediate syslog receiver like rsyslog or syslog-ng? Did you check your local firewall? If it's a Linux box, did you verify the rp_filter settings and routing?
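If the plan is to receive the data directly on the Splunk instance or forwarder, a minimal inputs.conf sketch looks like this (the port and sourcetype are assumptions, adjust them to your source):

    [udp://514]
    sourcetype = syslog
    connection_host = ip

If an intermediate receiver like rsyslog or syslog-ng is in the path instead, Splunk would typically monitor the files that receiver writes rather than listen on the UDP port itself.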
Do you have an example of this? I'm trying to work through it but I can't get anything to work. 
You can expect Splunk to use environment variables only in the cases documented in the conf file specs. So if you want to use a variable's value, you need to resolve the variable yourself within the script.
Okay, so it looks like you're getting a list of dictionaries returned, so it shouldn't be too hard for us to figure this out. I'm not an expert code writer by any means, but I've messed around with it enough to be able to troubleshoot at least, so I'll try to help. You should be able to do something like this, where 'example' is the response from the previous call:

    for x in example:
        # iterate over each dictionary's key/value pairs
        for k, v in x.items():
            if k == 'name':
                server_name = v
                url = 'https://su-ns-vpx-int-1.siteone.com/nitro/v1/config/lbvserver_servicegroup_binding/' + server_name
                # create API call here using new url
Thanks @richgalloway. Trying it out now, will let you know if it works.