All Posts

Hi Anand, Can you please give us a bit more detail about your environment? How many controllers are you using? Is it HA or a single environment? Thanks, Cansel
How do I integrate my website hosted on AWS (EC2) with Splunk?
Hi Raja, Do you have a regular load on your application? Although this is a very common situation and the cause is mainly associated with the load of the captured ITs not arriving, many different issues can cause it. If you wish, we can get together for a short session to find the root cause. Thanks, Cansel
Genius! Works perfectly!
Hi Surya, You can do that and much more with DEXTER: https://developer.cisco.com/codeexchange/github/repo/Appdynamics/AppDynamics.DEXTER/ Thanks, Cansel
Hi Pooja, Yes, you can do it in several ways. Can you access/edit the master page of your solution? Thanks, Cansel
| eval fidelity=if(source="source 1", 1, 2) | eventstats min(fidelity) as best by device | where fidelity == best
It states: ImportError: libssl.so.1.0.0: cannot open shared object file: No such file or directory. Try running sudo ldconfig. Running ldconfig after installing or removing shared libraries ensures that the system's dynamic linker can find and load the libraries correctly. If that doesn't work, check your permissions and check what changes, if any, were made on the Splunk server running the TA; that may help. If all that fails, then a support call may be your best option.
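As a quick way to confirm whether the dynamic linker can resolve a library before and after running ldconfig, Python's ctypes.util consults the same linker cache. This is only a diagnostic sketch; the library names probed here ("ssl", "crypto") are assumptions about what your TA needs:

```python
import ctypes.util

# find_library searches the linker cache that ldconfig rebuilds;
# it returns None when the library cannot be located.
for name in ("ssl", "crypto"):
    path = ctypes.util.find_library(name)
    print(name, "->", path if path else "NOT FOUND")
```

If the library still shows NOT FOUND after ldconfig, the .so file is genuinely missing or outside the configured library paths.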
This should produce an equivalent token value:

<input token="name" type="multiselect">
  <label>Name</label>
  <choice value="*">ALL</choice>
  <prefix>(</prefix>
  <suffix>)</suffix>
  <valuePrefix>name IN ("</valuePrefix>
  <valueSuffix>")</valueSuffix>
  <delimiter> OR </delimiter>
  <fieldForLabel>name</fieldForLabel>
  <fieldForValue>val</fieldForValue>
  <search>
    <query>index=my_index | dedup name | sort name | eval val = name+"\",\""+name+".*"</query>
  </search>
</input>

This will produce token values like:

(name IN ("VALUE1","VALUE1.*") OR name IN ("VALUE2","VALUE2.*") ...)

which are equivalent to:

(name="VALUE1" OR name="VALUE1.*" OR name="VALUE2" OR name="VALUE2.*" ...)

EDIT: Now that I think about it, you can produce exactly that token value by doing this:

<input token="name" type="multiselect">
  <label>Name</label>
  <choice value="*">ALL</choice>
  <prefix>(</prefix>
  <suffix>)</suffix>
  <valuePrefix></valuePrefix>
  <valueSuffix></valueSuffix>
  <delimiter> OR </delimiter>
  <fieldForLabel>name</fieldForLabel>
  <fieldForValue>val</fieldForValue>
  <search>
    <query>index=my_index | dedup name | sort name | eval val = "name=\""+name+"\" OR name=\""+name+".*\""</query>
  </search>
</input>
Hi Steve, Are you sure about your controller version? I have been using this platform since 2012 but have never heard of the version (4.10.x) that you mention in your previous post. Before "Calendar Versioning", the platform used controller versions 4.2.x, 4.3.x, 4.4.x, 4.5.x; then with Calendar Versioning, 20.x (since 2020). So can you please share your exact controller version (a screenshot may be more helpful), and also the version and framework of the APM agent that is currently facing the issue. Thanks, Cansel
In Splunk, the webhook Alert action accepts a single endpoint value to which to send the webhook. If you create an alert, then you can view it in Settings->"Searches, Reports, and Alerts", click the "Edit" dropdown, then click "Advanced Edit", then scroll down to the fields of "action.webhook". Here you can specify more settings for your webhook. As for sending a webhook for Akamai, do you have documentation describing what the webhook should look like? If I understand correctly, you would like Splunk to have an alert which sends a webhook to Akamai that contains an IP, from a field in the alert.
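For reference, the webhook settings you see under "Advanced Edit" live in savedsearches.conf. A minimal sketch is below; the alert name and URL are hypothetical placeholders, not values from your environment:

```ini
# savedsearches.conf (alert name and endpoint URL are hypothetical)
[My Akamai Alert]
action.webhook = 1
action.webhook.param.url = https://example-akamai-endpoint.invalid/hook
```

The webhook payload Splunk sends is a fixed JSON structure containing the alert's result fields, so the IP field from your alert results would be included there.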
If you are using dashboard studio, then you should see a "Default value" field appear on the Configuration column when the text input (text filter) box is highlighted in edit mode.
Hi Maniish, What kind of issue are you facing? Can you please give a little more detail about your issue. Thanks, Cansel
You would have to tell Splunk how to split the events. You can do this by setting the LINE_BREAKER attribute in a props.conf file in an app on your indexers. If you could post a sample of your events (with sensitive data removed) and a rough description of your Splunk setup (single machine or distributed?), it would be easier to give you more specific pointers.
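As a minimal sketch of such a props.conf stanza (the sourcetype name and the timestamp-based break pattern are assumptions about your data, since no sample was posted):

```ini
# props.conf on the indexers (sourcetype and regex are hypothetical)
[my_custom_sourcetype]
SHOULD_LINEMERGE = false
# break before lines that start with an ISO-style timestamp, e.g. 2024-05-01 12:34:56
LINE_BREAKER = ([\r\n]+)(?=\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2})
```

The capture group in LINE_BREAKER is discarded, and the lookahead keeps the timestamp at the start of each new event.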
It could be any number of things. If this was working before, finding out what changed is what you want to do first, then work out where the problem is. My guess is that if it was working before, the props/transforms have either been changed, overwritten, or removed. Has someone removed the props/transforms apps that those sourcetypes belong to from /opt/splunk/etc/apps? You can check by starting here to find out where those sourcetypes live:
/opt/splunk/bin/splunk btool props list --debug
/opt/splunk/bin/splunk btool transforms list --debug
Thanks! I am still learning Splunk and will modify my query to check for the events.
You normally need to find the events that show you the data, so these need to be logged first and then ingested into Splunk. Check whether the events below are there and search for them based on the user. Search on the EventCode field (I can't remember the exact field name, but it should be there). The events below may help you find the data you are looking for; for others, a search on Google will turn up plenty.
EventCode=4624: Successful user logon (interactive logon).
EventCode=4625: Failed user logon attempt.
EventCode=4648: Logon using explicit credentials (e.g., "Run As" or services).
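A rough sketch of such a search follows. The index name, the user field name, and the username are all assumptions about your environment (Windows add-ons commonly extract the account into a field like user or Account_Name):

```
index=wineventlog EventCode IN (4624, 4625, 4648) user="jdoe"
| stats count by user, EventCode
```

Adjust the index and field names to whatever btool or the field picker shows in your own data.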
I'm sure you know already that the Universal Forwarder just forwards data from files, from event logs, or from scripts. Some example scenarios where you would want a heavy forwarder include: * You are collecting logs using apps like DBConnect, Salesforce, HTTP modular input, etc. (These apps tend to be managed using the web interface, so a heavy forwarder is better) * You would like to perform parsing operations on data before it is indexed. E.g. you might want to send certain data to one indexer cluster and other data to another indexer cluster. * You would like to collect events using the HTTP Event Collector (HEC), but you don't want to expose the HEC interface of your indexers.
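The routing scenario in the second bullet (sending certain data to one indexer cluster and the rest to another) can be sketched with props/transforms on the heavy forwarder. The sourcetype and output group names below are hypothetical, and the group must match a tcpout stanza in outputs.conf:

```ini
# props.conf (sourcetype is hypothetical)
[web_access_logs]
TRANSFORMS-route = route_web_to_clusterA

# transforms.conf
[route_web_to_clusterA]
REGEX = .
DEST_KEY = _TCP_ROUTING
FORMAT = clusterA_indexers

# outputs.conf must define the matching target group, e.g.:
# [tcpout:clusterA_indexers]
# server = idxA1:9997, idxA2:9997
```

Events matching the sourcetype are rewritten to the clusterA_indexers output group; everything else follows the default tcpout group.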
HFs are a full Splunk instance; the UF is like an agent. We mainly use an HF if we want to ingest data via a technical add-on that uses modular inputs (Python etc.), forward data, or parse/mask the data before it's sent to the Splunk indexers (these are some of the use cases for an HF). The UF can do some parsing for some common data formats and can also be used for forwarding, but it is mainly used to collect logs. So think about your use case. For example, do you need to collect logs from something like AWS? Then it would be better to use an HF and forward the data (you can use the SH, but then you may need more resources). For the UF: are you just collecting logs, or do you want to forward some data on?
Hello @richgalloway  getIndex should return the value admin_audit from the eval; the search at the end should return the contents/events of the index admin_audit