If everything else works OK (other logs are ingested properly), it seems to be a local permissions problem. You can try to check the _internal events from this forwarder but I don't remember if the eventlog access problems show up in the logs if you don't raise debugging levels.
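If you do want to dig into _internal, a rough starting point (the host value is a placeholder for your forwarder's name, and the keyword filter is just a guess at what a permissions error would contain) might be something like:

index=_internal host=<your_forwarder> source=*splunkd.log* (log_level=ERROR OR log_level=WARN) "WinEventLog"

Raising the logging level for the relevant channel on the forwarder may still be needed before anything useful shows up there.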
This is one of the approaches. Another one would be to list all data and categorize it, then summarize and pick only the matching ones.

So in your case you can probably do something like

<your_search> earliest=-30d

to list all events, and

| eval state=if(_time<now()-86400,"old","new")

to categorize it. But this approach will work only because you have a single "type of search" and only the time differs, so the events are easily distinguishable.

In more complicated cases you can use another approach:

<your search> earliest=-30d latest=-24h
| eval state="old"
| append
    [ <your search> earliest=-24h
      | eval state="new" ]

Of course this one has limitations from the append command, so you might use multisearch instead.

Anyway. As you now have your search results, you can stats them:

| stats values(state) by answer

so you know whether each answer is included in the old or new set. Now all that's left is to filter the results to only see those you want. For example, if you want only those that are in the "new" period but not in the "old" one, you simply do

| where state="new" AND NOT state="old"

One caveat: matching multivalued fields can be a bit unintuitive, since a condition is matched on each value of the multivalue field separately, so

| where state="new" AND state!="old"

is a completely different condition (and I'll leave it as an exercise for the reader to find out what it matches).
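Putting the first approach together end to end, a minimal sketch might look like the following (note the "as state" rename on the stats line, which is needed so the where clause can reference the field by that name):

<your_search> earliest=-30d
| eval state=if(_time<now()-86400,"old","new")
| stats values(state) as state by answer
| where state="new" AND NOT state="old"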
Again, as Rich said, all data is searchable as long as it is hot, warm or cold. When it's rolled to frozen, it's either deleted (by default) or moved "out of" your Splunk installation, and can be treated as "archived" because it can't be used immediately and needs to be thawed in order to be searchable again. See https://docs.splunk.com/Documentation/Splunk/9.2.0/Indexer/HowSplunkstoresindexes

As soon as a bucket is frozen (assuming it's not deleted, but copied out to the frozen path or handled by your own script), it's not managed by Splunk anymore, so it's up to you to manage that frozen data and make sure it's kept for another year and deleted after that period.
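As a rough illustration only (the index name and frozen path are made up, and the values should be checked against your own sizing), a 1-year searchable retention with a frozen archive could look something like this in indexes.conf:

[my_index]
homePath   = $SPLUNK_DB/my_index/db
coldPath   = $SPLUNK_DB/my_index/colddb
thawedPath = $SPLUNK_DB/my_index/thaweddb
# roll buckets to frozen after roughly 1 year (365 days in seconds)
frozenTimePeriodInSecs = 31536000
# copy frozen buckets here instead of deleting them
coldToFrozenDir = /opt/splunk_frozen/my_index

The second "archived" year would then be handled outside Splunk, e.g. by a scheduled job that removes anything in the frozen directory older than a year.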
The "\\" sequence is a double escape. It is used because the regular expression is provided here as a string parameter to a command. In SPL strings can contain some special characters which can be ...
See more...
The "\\" sequence is a double escape. It is used because the regular expression is provided here as a string parameter to a command. In SPL strings can contain some special characters which can be escaped with backslash. Since backslash is used to escape other characters, it needs to be escaped itself. So if you type in "\\", it effectively becomes a string consisting of a single backslash. Therefore you have to be careful when testing regexes using rex command and later moving those regexes to config files as props/transforms since in props/transforms you usually don't have to escape the regexes. (unless you put them as string arguments for functions called using INGEST_EVAL). So - to sum up - your (?<char>(?=\\S)\\X) put as a string argument will be effectively a (?<char>(?=\S)\X) regex when unescaped and called as a regex. And this one you can of course test on regex101.com
Why specifically in CSV format? You can bring AppDynamics data into Splunk with the Splunk Add-on for AppDynamics, and add the ITSI Content Pack for APM to give you KPIs and services. Details here: https://docs.splunk.com/Documentation/CPAPM/1.1.0/CP/About
Hi Everyone,
I am looking for a little advice. I am currently searching Splunk against multiple sets of variables to see if there are any events in the past 90 days, however I am running into an issue with there being too many events that my search is parsing through. I don't need to see the total number of events that matched, I only need to see if there were at least 10 events that matched. Since there are 100+ sets of variables to check, doing it by hand one at a time seems tedious and lengthy. Would you be able to help me limit the events parsed so that it stops checking a set once it reaches a predetermined amount?
Here is an example of my search:
index=blah sourcetype=blah (name=Name1 ip=IP1 id=id1) OR (name=Name2 ip=IP2 id=id2) OR (name=Name3 ip=IP3 id=id3) OR .... (name=Name105 ip=IP105 id=id105) | stats count by name, ip, id
Any and all help would be appreciated
Hi, I want to connect live data from various applications in AppDynamics to Splunk ITSI in CSV format. How can I achieve this? Can anyone help me? It would be great if I could get some guidance from this community. Thanks and Regards, Abhigyan.
Hi @twanie, good for you, see you next time! Let us know if we can help you more, or please accept one answer for the other people of the Community. Ciao and happy splunking, Giuseppe. P.S.: Karma Points are appreciated by all the contributors
Hi @Arthur_Kwan, it isn't so easy to do, but you could look at the Splunk Dashboard Examples app (https://splunkbase.splunk.com/app/1603): the Null Search Swapper example shows how to display/hide a panel based on a search result, and the In-page Drilldown example shows how to set a token to use in the same page and how to display it. Mixing these two samples, you should be able to meet your requirement. Ciao. Giuseppe
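As a very rough Simple XML sketch of that token logic (the exact condition syntax may need tweaking for your Splunk version, and panelA/panelB are just the token names from the question), the table's drilldown could look something like:

<drilldown>
  <condition match="&quot;$row.gender$&quot; == &quot;female&quot;">
    <set token="panelA">true</set>
    <unset token="panelB"></unset>
  </condition>
  <condition match="&quot;$row.gender$&quot; == &quot;male&quot;">
    <set token="panelB">true</set>
    <unset token="panelA"></unset>
  </condition>
</drilldown>

Each panel would then carry a depends="$panelA$" or depends="$panelB$" attribute so it only renders while its token is set.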
Thank you so much @richgalloway for your prompt response. Yes, it is Splunk Enterprise on premises and the retention is two years:
- 1 year searchable, and
- another 1 year archived,
which together make the 2-year retention.
Q1 - What would this configuration look like?
Q2 - Is there any documentation covering this specifically for Splunk Enterprise?
Thank you in advance.
When two apps define values for the same sourcetype, the apps are applied in lexicographical order. If you want your app to have precedence, give it a name that sorts before the official app's.
I have a table and a couple of panels on my dashboard. I would like to click a table row and display/hide certain panels depending on the value of a specific column.

name   gender  age
Alice  female  18
Bob    male    22

For instance, I have the above table. I would like to display panel A and hide panel B when I click a row with gender=female, and display panel B and hide panel A when I click a row with gender=male. Let's say panel A depends on token panelA and panel B depends on token panelB. How should I do that? I am thinking about doing that in the drilldown setting but I do not know how to set or unset with a condition.
It looks like if you get any results in answer, they will be new. You could test this by shortening your subsearch to earliest=-25h latest=-24h, which should show new addresses if they occur in the last 24h but not in the hour before that.
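Concretely, keeping your outer search unchanged and only narrowing the subsearch window, the test would look roughly like:

index=[sample index] sourcetype=[sample sourcetype] earliest=-24h latest=now NOT
    [ search index=[sample index] sourcetype=[sample sourcetype] earliest=-25h latest=-24h
      | stats count by answer
      | table answer ]
| stats count by answer
| table answer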
Splunk sirs, I am trying to add a boolean column to my data called 'new_IP_detected' which will tell me whether an answer IP is new compared to answer IPs from a previous time range. Both searches are from the same index and sourcetype, and I only want to compare whether or not an answer IP from -24h to now is in the list of answer IPs from -30d to -24h.

My search so far:

index=[sample index] sourcetype=[sample sourcetype] earliest=-24h latest=now NOT
    [ search index=[sample index] sourcetype=[sample sourcetype] earliest=-30d latest=-24h
      | stats count by answer
      | table answer ]
| stats count by answer
| table answer

As of right now I am getting no results, which I believe is expected (meaning there are no new IPs in the last 24 hrs). How would I add the 'new_IP_detected' column over the last 30 days?
Hi, So I’m working on creating an alert in Splunk, but I’m having some issues with setting up the query. The goal of the alert is to trigger when a shared drive or folder in Google Drive has been shared externally for longer than a set period of time. I’ve seen some mentions of using the pollPeriod setting and the fschange input, but those seem to be better suited for system directories rather than Google Drive. Any advice on how to start setting up this query?
Hello,
I am trying to count how many days out of the last 12 months our users logged into two of our servers, and in the end I want it to display the number of days out of the 12 months that the users logged in. So if a user logged in 4 times in one day, it should count as 1 day.
I have tried "timechart span=1d count by Account_Name"; this looked promising, but timechart groups Account_Names into an OTHER field, which is misleading because there are other accounts in that field.
index=windows source="WinEventLog:Security" EventCode=4624 host IN (Server1, Server2) Logon_Type IN (10, 7)
| eval Account_Name = mvindex(Account_Name,1)
| timechart span=1d count by Account_Name
| untable _time Account_Name count
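One rough alternative sketch, keeping the same base search and mvindex handling but counting distinct days directly with stats instead of timechart (which also sidesteps the OTHER grouping), might be:

index=windows source="WinEventLog:Security" EventCode=4624 host IN (Server1, Server2) Logon_Type IN (10, 7)
| eval Account_Name = mvindex(Account_Name,1)
| bin _time span=1d
| stats dc(_time) as days_logged_in by Account_Name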