Per host there are two events logged: one indicating a transition to active and one indicating a transition to inactive. I can't figure out a query that can accurately do this per host given the following stipulations. For the first event within the query time range, it can be assumed the host was in the opposite state prior. Only calculate transitions between the two states; if there are multiple identical events between transitions, use the time of the first occurrence. Include the latest condition up until the time the search is run.
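One way to sketch this (assuming an index/sourcetype and a `state` field holding "active"/"inactive" — adjust the names to your data) is to drop repeated same-state events with streamstats, then pair each kept event with the chronologically next one per host, closing the last open interval at now():

```spl
index=main sourcetype=host_state state IN ("active","inactive")
| sort 0 host _time
| streamstats current=f last(state) as prev_state by host
| where state!=coalesce(prev_state, "__none__")      /* keep first event and true transitions only */
| sort 0 host - _time
| streamstats current=f last(_time) as end_time by host
| eval end_time=coalesce(end_time, now())            /* latest condition runs until search time */
| eval duration=end_time-_time
| stats sum(duration) as total_seconds by host, state
```

This handles the "assume opposite state prior" stipulation implicitly, since the first event in range always starts a counted interval.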
Ok. Honestly, you lost me here. What does filtering characters have to do with extracting fields? Either you filter characters and parse the resulting event, or you parse out fields and then filter each field on its own. Or am I missing something?
Because I need to create the new field for every entry. If I were to implement that, then it would simply not match that entry and move on, effectively ignoring it. So I'm trying to find a way to clean this portion of the data so that I can capture every entry. Right now I'm only matching on entries that contain [\w\s], so I'm missing a bunch. True, [\X] would ignore less, but I'm trying to ignore none and capture everything, avoiding problems by cleaning/manipulating the text before capturing it within the capture group.
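If the goal is to strip problem characters before the capture rather than widen the character class, a sed-mode rex pass can clean the field first and a second rex can then match everything that remains (the field name `message` and capture name `entry` here are hypothetical):

```spl
| rex mode=sed field=message "s/[^\w\s]//g"     /* remove everything outside [\w\s] in place */
| rex field=message "^(?<entry>.+)$"            /* now every non-empty entry matches */
```

The trade-off is that the cleaning pass mutates the field, so keep a copy (e.g. with an eval beforehand) if you still need the raw value.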
I have a feeling that you're thinking in SQL and want to bring the same paradigm to Splunk. Try describing what data you have and what you want to get as a result. We'll see how to get there.
Hi @tushar.darekar,
I want to make sure I understand your question. You are asking what is the max number of Pages, iframes, Virtual Pages and Ajax calls you can exclude?
My current search that is working is:

| from datamodel:Remote_Access_Authentication
| rex field=dest_nt_domain "^(?<dest_nt_domain>[^\.]+)"
| join dest_nt_domain [| inputlookup Domain | rename name AS dest_nt_domain | fields dest_nt_domain]
| table dest_nt_domain

My problem is that this search only returns values that match. How can I change this to an evaluation? If the two items match, "Domain Account"; if not, "Non Domain Account". My input lookup only contains one item.
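A lookup plus eval avoids the filtering behavior of join entirely. A sketch along these lines (keeping the `name` field from your inputlookup, and the hypothetical output name `lookup_match`) labels every row instead of dropping non-matches:

```spl
| from datamodel:Remote_Access_Authentication
| rex field=dest_nt_domain "^(?<dest_nt_domain>[^\.]+)"
| lookup Domain name AS dest_nt_domain OUTPUT name AS lookup_match
| eval account_type=if(isnotnull(lookup_match), "Domain Account", "Non Domain Account")
| table dest_nt_domain account_type
```

Rows present in the lookup get `lookup_match` populated; rows absent from it leave the field null, which the eval turns into the "Non Domain Account" label.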
Generally speaking, the 403 error is on the server side. You say you are trying to download a trial, and that you "Managed to register it." I don't know what the latter means. If it's a trial, you download it, install it and it just works. They're *all* trials until you put in a real license key (or connect it to your license master).[1] You could try a) just logging into a different place in Splunk, like docs.splunk.com. I think once you are logged in there, you can then go back to splunk.com and see if it all still works. b) Creating a new account and seeing if you can download it then. I'm not aware of any special process for this; you just "create new account", fill in a few pieces of information, and off to the downloads you go. Anyway, hope one of these gets you working again! [1] LOL, this has actually been a complaint of mine for a decade now. I don't want to download a trial and convert it; I want to feel like I'm downloading the actual paid-for version that $company owns. But it's just words, so 'whatever'.
Hi, we have a data model built against application data. All the tstats searches against the DM were running fine, including the ones using summariesonly=true. I was noticing some discrepancy between the data model and the raw data when plotting a timechart for the exact same time range. I checked the data model and found the _time field had not been added. But after adding it and re-accelerating the data model, I can't use summariesonly=true anymore: no results are returned. I do get data back without summariesonly=true. What could have gone wrong here? UPDATE: I am now able to search using summariesonly=true (maybe the DM needed more time to regenerate), but I see a massive difference in counts between summariesonly=true and false. Data with false closely matches the raw data stats. Before the _time change, even summariesonly=true was matching the counts precisely. I see the _time field is set to "required" in the model, but I don't think that would prevent certain events from going into the summary; all events in the raw data have the default _time field. Am I missing some key fact here about how summary calculation might have changed with the addition of this _time field?
Well, tstats works with more than just time fields. The limitation is that it only works with fields that are created *at* the time the events are indexed. https://docs.splunk.com/Documentation/Splunk/9.1.3/Indexer/Indextimeversussearchtime I honestly think some of the information in there about performance is outdated, but most of it is still fine documentation. Index-time fields are those created when the data is indexed. By default that's just the built-in fields, like _time, sourcetype, and so on. In some cases it's all the fields, for instance with INDEXED_EXTRACTIONS=<json/csv/whatever>. But otherwise Splunk generally relies on search-time fields - fields that are built "on the fly" as you run your search. That's more flexible and doesn't go 'out of date' as events change or your needed fields change around. The docs above should have links off them to explain more, but that's the gist of it.
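As a rough illustration of what defining an index-time field looks like (stanza and field names here are invented, not from the thread), the classic recipe spans props.conf, transforms.conf, and fields.conf on the indexing tier:

```ini
# props.conf
[my_sourcetype]
TRANSFORMS-region = add_region_indexed

# transforms.conf
[add_region_indexed]
REGEX = region=(\w+)
FORMAT = region::$1
WRITE_META = true

# fields.conf
[region]
INDEXED = true
```

A search-time extraction, by contrast, is just an EXTRACT-/rex definition and needs no reindexing, which is why it's the default approach.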
WORKED! And this is my final query. TY

`notable_by_id("*")`
| search status_end="false"
| where severity IN ("high", "critical")
| eval timenow=now()
| eval nowstring=strftime(now(), "%H:%M:%S %p")
| eval diff=now()-_time
| eval diff=tostring(diff, "duration")
| table _time, diff, rule_name, owner, status_label, "Audit Category", urgency
| rename status_label as Status
| rename diff as time_difference
Hi @LinghGroove, no problem: you can copy the license file from the old License Master to the new one. Obviously, remove the old one after copying, to avoid having two installations with the same license. Ciao. Giuseppe
Hello all, I am managing a Splunk architecture with an Enterprise license. Sometime during this year I will need to do an architecture migration from my current architecture to a new one, eliminating the old one. Will I be able to just copy the license file into the license manager of the new architecture? Is there some contractual problem with this procedure? Thanks a lot.
@cedricamouyal - You are in the right direction: use that document to configure the proxy in server.conf, and Splunk will use that proxy for every request originated by Splunk, including licensing. I hope this helps!! Please upvote if it does!!!
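For reference, the server.conf stanza that document describes looks roughly like this (host, port, and exclusion list are placeholders to adapt to your environment):

```ini
# $SPLUNK_HOME/etc/system/local/server.conf
[proxyConfig]
http_proxy = http://proxy.example.com:8080
https_proxy = http://proxy.example.com:8080
no_proxy = localhost, 127.0.0.1
```

A restart of splunkd is needed after the change for outbound requests, including license calls, to go through the proxy.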
@raghul725 - The line below can do what addcoltotals does here:

| appendpipe [| stats sum(File_Count) by Total_Delivered]

I used Total_Delivered instead of "Total Delivered", as spaces sometimes create issues with tokens. I hope this helps!!