All Posts



I'm not sure I understand your question properly. Are you asking how to find a timestamp that is not included in the data you have? Well, if it's not there, you need to make sure it's exported from the source somehow. It's more a SolarWinds question than a Splunk one.
I am creating a dashboard with Splunk to monitor offline assets in my environment with SolarWinds. I have the add-on and incorporate solarwinds:nodes and solarwinds:alerts into my query. I am running into an issue where I can't get the correct output for how long an asset has been down. In SolarWinds you can see Trigger Time in the Alert Status Overview; this shows the exact date and time the node went down. I cannot find a field in the raw data of either sourcetype that will give me that output. I want to use eval to show how much time has passed since the trigger. Does anyone know how to achieve this?
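A hedged sketch of the eval approach, assuming the trigger time is ingested with the solarwinds:alerts events under a field such as TriggerTime (the field name, its timestamp format, and NodeName are assumptions here — check your raw events for the actual names):

```
sourcetype="solarwinds:alerts"
| eval trigger_epoch=strptime(TriggerTime, "%Y-%m-%d %H:%M:%S")
| eval downtime=tostring(now() - trigger_epoch, "duration")
| table NodeName TriggerTime downtime
```

strptime converts the text timestamp to epoch seconds, and tostring(..., "duration") renders the difference from now() as a readable HH:MM:SS duration.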
Thanks, I will give it a try.
Hi, every index has a file which records the last used bucket number. You should also update this on the node where you have copied those buckets, so that it refers to the correct number. Of course, if you have copied the whole indexes directory, then you have probably copied those files too. If you haven't copied them, the indexer could overwrite old buckets with new events. r. Ismo
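To illustrate the numbering: warm bucket directories are named db_&lt;newestTime&gt;_&lt;oldestTime&gt;_&lt;localId&gt;, so you can read the highest local bucket id off the directory names of the buckets you copied. A minimal sketch with made-up directory names (the times and ids below are placeholders, not real data):

```shell
# Sample warm-bucket directory names, as found under $SPLUNK_DB/<index>/db
# (format: db_<newestTime>_<oldestTime>_<localId>) -- placeholder values.
buckets="db_1700000300_1700000100_4
db_1700000600_1700000400_5
db_1700000900_1700000700_12"

# Extract the 4th underscore-separated field (the local id) and take the max:
max_id=$(echo "$buckets" | awk -F_ '{print $4}' | sort -n | tail -1)
echo "$max_id"
```

The id counter on the destination indexer must be at least this value, or new buckets could collide with the copied ones.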
Hi @joe06031990, you can find the locations where SSL config is present. The following command is helpful for finding locations other than the default: from a command prompt, navigate to splunk--->bin and run splunk btool inputs list ssl --debug | grep -i local (use findstr instead of grep on Windows).
Hi, I can see the below error in the internal logs for a host that is not bringing any logs into Splunk: ERROR SSLOptions [17960 TcpListener] - inputs.conf/[SSL]: could not read properties. We don't have SSL options in inputs.conf; I just wondered if there were any other locations to check on the universal forwarder, as it works fine for other servers.
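For reference, this is the shape of a minimal [SSL] stanza as it would appear in inputs.conf on a receiving instance — the certificate path and password below are placeholders, not values from any real environment:

```
# inputs.conf -- minimal [SSL] stanza (placeholder values)
[SSL]
serverCert = $SPLUNK_HOME/etc/auth/server.pem
sslPassword = <certificate_password>
```

If no such stanza exists anywhere in the layered config (which btool can confirm), the error points at an input expecting SSL properties that were never defined.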
How did you solve it?
Hi @Khalid.Rehan, Thank you for updating the thread and letting us know. 
@gcusello that worked great, thank you. Do you also happen to know the best way to add the totals for each carrier, like on lines 5 and 9 of my example chart? Like appendpipe?
I have the Microsoft Teams Add-on for Splunk installed and set up the inputs for the webhook. When I try to curl the webhook using the internal IP and the port that I have it set to, I get a failed-to-connect error. Part of the issue could be that I don't have the webhook set to HTTPS. Unfortunately, I'm not sure how to make the webhook accessible over HTTPS; this isn't something I typically do. I've tried looking up how to make my webhook accessible, but I haven't had any luck, and nothing I found made clear sense to me.
Hi @belleke , good for you, see next time! Ciao and happy splunking Giuseppe P.S.: Karma Points are appreciated by all the contributors
Hi @YuliyaVassilyev , at first Splunk isn't Excel! anyway you could try something like this: <your_search> | eval col=Region."|".Director | bin span=1mon _time | chart count OVER col BY _time | re... See more...
Hi @YuliyaVassilyev, first of all, Splunk isn't Excel! Anyway, you could try something like this:

<your_search>
| eval col=Region."|".Director
| bin span=1mon _time
| chart count OVER col BY _time
| rex field=col "^(?<Region>[^\|]+)\|(?<Director>.*)"
| fields - col
| table Region Director *
| addcoltotals
| addtotals

then addcoltotals and addtotals add the partial totals. Ciao. Giuseppe
Hi there! I want to create a scorecard by Manager and Region counting my Orders over Month. So the chart would look something like the above. I have all the fields: Region, Director, Month and Order_Number to make a count. Please let me know if you have an efficient way to do this in SPL. Thank you very much!
I've solved the issue, thanks for your help! @richgalloway 
Hi @lukasmecir, good for you, see you next time! Let us know if we can help you more, or, please, accept one answer for the other people of the Community. Ciao and happy splunking. Giuseppe P.S.: Karma Points are appreciated by all the contributors
Yep, good point, thank you.
Hi @lukasmecir, remember to copy indexes.conf to the new machines. Ciao. Giuseppe
Hi @belleke, install the Splunk_TA_Windows ( https://splunkbase.splunk.com/app/742 ) on the UF, remembering that by default all the inputs are disabled, so you have to create a new folder called "local", copy inputs.conf from the default folder, and change disabled=1 to disabled=0 for all the inputs you need. Then install the above Add-On on the Splunk Server as well. Ciao. Giuseppe
Hi, I tried my process:
1. Clean install of new IDX
2. Run new IDX for the first time
3. Create index on new IDX
4. Stop the new IDX
5. Stop the old all-in-one instance
6. Copy (by rsync -a command) desired WARM buckets (db_... dirs) from the old instance to the new IDX
7. Delete copied buckets from the old all-in-one instance
8. Start both instances
9. Add new IDX as search peer on the old instance
10. Reconfigure outputs.conf on forwarders to add the new IDX
Everything seems OK now; I'll let it run for some time and check again.
Thank you for the hint, it sounds interesting, I will try it. Redundancy is not desired in this case, so it's no problem.