Activity Feed
- Posted How to sum all the Latest events for the specific field on Splunk Search. 07-08-2020 04:36 AM
- Posted Re: How to parse/index only json entry from raw data which are in non-uniform pattern? on Getting Data In. 01-26-2020 11:56 PM
- Posted Re: How to parse/index only json entry from raw data which are in non-uniform pattern? on Getting Data In. 01-26-2020 11:17 PM
- Posted How to parse/index only json entry from raw data which are in non-uniform pattern? on Getting Data In. 01-26-2020 11:10 PM
- Tagged How to parse/index only json entry from raw data which are in non-uniform pattern? on Getting Data In. 01-26-2020 11:10 PM
- Posted Re: regex delimiters & config file formatting on Splunk Search. 04-26-2019 12:13 AM
- Posted File monitoring in inputs.conf on Getting Data In. 04-12-2019 08:47 AM
- Tagged File monitoring in inputs.conf on Getting Data In. 04-12-2019 08:47 AM
- Posted Re: Is there any way to decode an encoded html values saved in a log file? on Splunk Search. 04-08-2019 07:20 AM
- Posted Re: Is there any way to decode an encoded html values saved in a log file? on Splunk Search. 04-08-2019 06:50 AM
- Posted Re: Is there any way to decode an encoded html values saved in a log file? on Splunk Search. 04-08-2019 02:58 AM
- Posted Is there any way to decode an encoded html values saved in a log file? on Splunk Search. 04-05-2019 02:41 AM
- Tagged Is there any way to decode an encoded html values saved in a log file? on Splunk Search. 04-05-2019 02:41 AM
07-08-2020
04:36 AM
How to sum all the latest events for a specific field?

Example raw data of the event:

Client=XXXXX,CreationTime=3/19/2020 9:09:36 AM,Version=08_07,NumberOfRequests=1,LastRequestTime=3/19/2020 9:09:36 AM,InactiveTimeSpan=0.7 minutes
Client=XXXXX,CreationTime=3/19/2020 9:08:50 AM,Version=08_07,NumberOfRequests=46,LastRequestTime=3/19/2020 9:10:17 AM,InactiveTimeSpan=0.0 minutes
Client=XXXXX,CreationTime=3/19/2020 9:09:56 AM,Version=08_07,NumberOfRequests=2,LastRequestTime=3/19/2020 9:10:13 AM,InactiveTimeSpan=0.1 minutes

Splunk query used:

index=mds sourcetype=logs host=xxx AND NumberOfRequests | rex field=_raw max_match=0 ",NumberOfRequests=(?P<my_requests>\d+)," | mvexpand my_requests | stats sparkline(sum(my_requests)) as Trend, sum(my_requests) as Total, avg(my_requests) as Avg, max(my_requests) as Peak, latest(NumberOfRequests) as Current, latest(_time) as "Last Updated" by host | convert ctime("Last Updated")

As shown in the example, a single event contains three NumberOfRequests values. Given a stream of similar events, each with different NumberOfRequests values, I want a field that holds the sum of the NumberOfRequests values from the latest event only. Please suggest.
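A minimal SPL sketch of one way to get that, assuming the same rex extraction as above and that "latest" means the event with the most recent _time per host (the command chain and field names are illustrative, not a confirmed answer):

```
index=mds sourcetype=logs host=xxx NumberOfRequests
| rex field=_raw max_match=0 ",NumberOfRequests=(?P<my_requests>\d+),"
| mvexpand my_requests
| stats sum(my_requests) as event_total by host, _time
| sort 0 - _time
| dedup host
| rename event_total as "Latest Event Total"
```

The mvexpand plus stats by host and _time collapses each event back to a single per-event total, and sort plus dedup then keeps only the newest event per host.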
01-26-2020
11:56 PM
Is there any way to make this possible through configuration changes while parsing/indexing the log file itself?
01-26-2020
11:17 PM
From the raw data below, only the JSON needs to be extracted/indexed in Splunk, and it should be shown as a structured JSON view when searching these logs on the search head.
<BOR>
ExSrc:Schwab.Client.Fx^
URL:null^
LogMsg:{"actor":{"Cust":null,"Acct":null,"Rep":null,"System":null},"header":{"AppId":null,"RecId":"null","Ver":"","StartTS":"null"},"source":{"Ip":"*","MacAddress":null,"SRCOS":"null","SRCRuntime":null,"SRCAppName":null,"SRCAppVersion":null,"SRCReqId":"null","CorrelationId":"null","SourceId":null,"Uri":"null"}}^
ExType:Common.Exceptions.ServiceCommunicationException^
<EOR>
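For reference, a minimal index-time sketch of the kind of configuration being asked about, assuming the events are delimited by <BOR>/<EOR> and that the stanza name, LINE_BREAKER, and SEDCMD regex are placeholders to adapt (not a confirmed solution):

```
# props.conf (sketch; stanza name and regexes are assumptions)
[json_only_logs]
SHOULD_LINEMERGE = false
# Break the stream after each <EOR> so one <BOR>...<EOR> record is one event
LINE_BREAKER = <EOR>([\r\n]+)
# Keep only the JSON payload after "LogMsg:" and drop the surrounding lines
SEDCMD-keep_json = s/(?s).*?LogMsg:(\{.*\})\^.*/\1/
# Parse the remaining JSON at search time
KV_MODE = json
```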
01-26-2020
11:10 PM
How to parse/index only json entry from raw data which are in non-uniform pattern?
04-12-2019
08:47 AM
I want to monitor a file in a directory that rolls over to a new file within 2 minutes.
I tried the basic inputs.conf below. It works, but it misses the data that gets rolled into the new files. For example, test.log is the file I want to continuously monitor; test.log is renamed to test-1.log within 2 minutes and new data is then written to test.log. My config reads test.log once and reads it again only after 6 minutes, i.e., the data rolled into test-2.log (created in the 4th minute) and test-3.log (created in the 6th minute) is ignored. I want to monitor only test.log without any loss of its data.
Note: the logs are on *nix systems.
inputs.conf used:
[monitor:///opt/sample/logs/test*.log]
index = test
disabled = false
sourcetype = test_logs
blacklist = (test*-\d{1,2}\.log$)
ignoreOlderThan = 30d
crcSalt = <SOURCE>
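A sketch of one variant that could be tried, assuming the goal is simply to keep following test.log across rotations; dropping blacklist and crcSalt is an assumption to verify against this rotation scheme, not a confirmed fix:

```
# inputs.conf (sketch; verify against the actual rotation behaviour)
[monitor:///opt/sample/logs/test*.log]
index = test
disabled = false
sourcetype = test_logs
ignoreOlderThan = 30d
# blacklist and crcSalt removed (assumption): without the path-based salt,
# the content CRC lets the monitor treat a renamed test-N.log as already
# read, instead of re-salting every rotated copy
```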
04-08-2019
07:20 AM
@pkeenan87, the urldecode function is not working as expected. I tried it, but it only decodes URL-encoded addresses, not a string containing HTML ASCII-encoded values.
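A minimal search-time sketch of the kind of substitution meant here, assuming the encoded values are numeric HTML escapes such as &#x25; (the index, sourcetype, and entity list are placeholders, not a confirmed fix):

```
index=your_index sourcetype=your_sourcetype
| rex mode=sed field=_raw "s/&#x25;/%/g"
| rex mode=sed field=_raw "s/&#x3a;/:/g"
| rex mode=sed field=_raw "s/&#x2d;/-/g"
```

Each rex mode=sed call rewrites one entity in place; this only covers entities listed explicitly, unlike a general decoder.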
04-08-2019
06:50 AM
@niketnilay, please find below a sample of the encoded HTML values in the log file.
%[datetime] [Default: 0] [] [INFO ] [*] - 
-- Start of getter Meods
retiremevice.io.ReturnCd: 0
retiremevice.io.RtSCd:
04-08-2019
02:58 AM
Hey Splunk folks,
Are there any possible ways or ideas to do this?
04-05-2019
02:41 AM
I want to decode all the encoded HTML values present in a log file at index time.
Is there any way to do it?
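A minimal index-time sketch of what such a configuration could look like, assuming the encoded values are numeric HTML escapes and that the sourcetype name and entity list are placeholders (one SEDCMD class per entity; not a confirmed approach):

```
# props.conf (sketch; sourcetype name and entities are assumptions)
[html_encoded_logs]
# Decode a few numeric HTML escapes at index time, one SEDCMD per entity
SEDCMD-decode_percent = s/&#x25;/%/g
SEDCMD-decode_colon = s/&#x3a;/:/g
SEDCMD-decode_dash = s/&#x2d;/-/g
```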