Activity Feed
- Got Karma for Re: Case insensitive search in rex. Thursday
- Posted Re: Single Value Drilldown not working in Dashboard Studio on Splunk Enterprise. 02-25-2025 02:18 PM
- Posted Single Value Drilldown not working in Dashboard Studio on Splunk Enterprise. 02-25-2025 01:19 PM
- Got Karma for Re: Error in 'litsearch' command: Your Splunk license expired or you have exceeded your license limit too many times.. 12-09-2024 12:51 PM
- Posted Splunk DB connect indexing only 10k events per hour on Getting Data In. 08-28-2024 12:38 PM
- Got Karma for Re: Why are deployment clients not showing up in deployment server?. 07-29-2024 07:50 AM
- Got Karma for Re: Receiving the following error: "failed to start splunk.service: unit splunk.service not found". 07-18-2024 10:51 AM
- Got Karma for Re: How do I set a query to run overnight without it expiring before it completes?. 07-11-2024 03:42 PM
- Got Karma for Re: I am not recieving the logs of my linux machine. 06-04-2024 06:52 PM
- Got Karma for Re: I am not recieving the logs of my linux machine. 06-04-2024 05:11 PM
- Got Karma for Re: I am not recieving the logs of my linux machine. 06-04-2024 05:11 PM
- Got Karma for Re: I am not recieving the logs of my linux machine. 06-04-2024 05:11 PM
- Got Karma for Re: How can I convert column value to column?. 04-16-2024 06:21 AM
- Got Karma for Re: What does "notracking@example.com" mean in Splunk Add-on for Microsoft Cloud Services?. 04-03-2024 07:35 PM
- Got Karma for Re: JSON format - Duplicate value in field. 04-03-2024 02:05 AM
- Got Karma for Re: How can I identify real time searches?. 03-15-2024 12:52 AM
- Got Karma for Re: stats vs eventstats. 03-01-2024 09:58 PM
- Got Karma for Re: Datamodel Acceleration questions. 02-13-2024 08:27 AM
- Got Karma for Re: How to overwrite the host field value with dvc field value ?. 10-04-2023 06:52 AM
- Got Karma for Re: what actually dnslookup doing in my query? and what is it?. 09-26-2023 06:18 AM
Topics I've Started
04-06-2022
09:16 PM
For SPL, try this:
<your base search>
| extract kvdelim=":" pairdelim=","
04-06-2022
09:10 PM
1 Karma
Sorry, you mentioned the dd-mm-yyyy-hh:mm:ss format. If your format is actually mm-dd-yyyy-HHMMSS, the search below should work for all the sources:
| makeresults
| eval source="/admin/logs/abc/inventory/04-04-2022-101634-all-b5.xxx"
| eval _time=strptime(replace(source,".*(?=/)/",""),"%m-%d-%Y-%H%M%S")
04-06-2022
03:27 PM
Example file name: /admin/logs/abc/inventory/04-04-2022-101634-all-b5.xxx
File name format: %d-%m-%Y-%H%M%S-all-b5.xxx
_time value assigned to events: 2022-04-04 10:16:34

props.conf
[mysourcetype]
TRANSFORMS = timestampeval

transforms.conf
[timestampeval]
INGEST_EVAL = _time=strptime(replace(source,".*(?=/)/",""),"%d-%m-%Y-%H%M%S")

This takes the "source" metadata value (which is the path and file name), removes the path, then extracts the date and time from the file name. All events in the file will get the same _time when imported. Run the run-anywhere search below to test:
| makeresults
| eval source="/admin/logs/abc/inventory/04-04-2022-101634-all-b5.xxx"
| eval _time=strptime(replace(source,".*(?=/)/",""),"%d-%m-%Y-%H%M%S")
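To check the replace/strptime logic outside Splunk, here is a rough Python equivalent. One difference to be aware of: Splunk's strptime() tolerates trailing text after the timestamp, while Python's does not, so this sketch isolates the stamp with a regex first.

```python
import re
from datetime import datetime

def time_from_source(source):
    """Mimic the INGEST_EVAL: strip the path, then parse dd-mm-YYYY-HHMMSS."""
    name = re.sub(r".*/", "", source)  # drop everything up to the last "/"
    stamp = re.match(r"\d{2}-\d{2}-\d{4}-\d{6}", name).group(0)
    return datetime.strptime(stamp, "%d-%m-%Y-%H%M%S")

print(time_from_source("/admin/logs/abc/inventory/04-04-2022-101634-all-b5.xxx"))
# → 2022-04-04 10:16:34
```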
04-06-2022
02:16 PM
1 Karma
Is this what you are looking for?
<your search>
| rex field=_raw "query=\"(?<query>.*)\"\s+params=\"(?<params>[^\"]+)"
| table query params
If not, please give us some input examples and the expected output you are looking for.
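To sanity-check that rex pattern outside Splunk, here is a quick Python sketch. The sample line is hypothetical (named groups use Python's `(?P<...>)` syntax); the asker's real events may differ.

```python
import re

line = 'ts=1 query="SELECT * FROM users" params="limit=10,offset=0" status=ok'  # hypothetical
m = re.search(r'query="(?P<query>.*)"\s+params="(?P<params>[^"]+)', line)
print(m.group("query"))   # → SELECT * FROM users
print(m.group("params"))  # → limit=10,offset=0
```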
04-06-2022
02:07 PM
Not sure if this is what you are looking for:
index=k8s_events namespace=ecom-middleware NOT method=OPTIONS response_code>200
| bin _time span=1d
| stats count by path _time
| streamstats window=5 sum(count) as total_count avg(count) as avgCount by path
| fields _time path total_count avgCount

Say you run that search over the last 30 days: each row is a unique day for a given path, with a '_time' field and an 'avgCount' field. The avgCount field is the average events per day over that day and the 4 days preceding it.
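The streamstats moving average can be sketched in Python to see what each row's avgCount works out to (window of 5, including the current day):

```python
def moving_avg(counts, window=5):
    # mirrors `streamstats window=5 avg(count)`: average of the current
    # value and up to window-1 preceding values
    out = []
    for i in range(len(counts)):
        chunk = counts[max(0, i - window + 1): i + 1]
        out.append(sum(chunk) / len(chunk))
    return out

print(moving_avg([10, 20, 30, 40, 50, 60]))
# → [10.0, 15.0, 20.0, 25.0, 30.0, 40.0]
```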
04-06-2022
01:14 PM
In your environment, you should try:
<your search>
| rex field=<fieldname> "^(?<custom_fieldname>[^\.]+)"

Or try this run-anywhere search:
| makeresults
| eval data="24611_abce.XXX.AAA.com,24612_r1q2e3.XXX.AAA.com,null,null,4iop45_q7w8e9.XXX.AAA.com,hki90lhf3_m1n2b3.QQQQ.AAA.com"
| makemv data delim=","
| mvexpand data
| rex field=data "^(?<data>[^\.]+)"

Let me know if this helps!
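The same "everything before the first dot" idea in plain Python, using the sample values from the run-anywhere search above:

```python
import re

data = ("24611_abce.XXX.AAA.com,24612_r1q2e3.XXX.AAA.com,null,null,"
        "4iop45_q7w8e9.XXX.AAA.com,hki90lhf3_m1n2b3.QQQQ.AAA.com")
short = [re.match(r"[^.]+", v).group(0) for v in data.split(",")]
print(short)
# → ['24611_abce', '24612_r1q2e3', 'null', 'null', '4iop45_q7w8e9', 'hki90lhf3_m1n2b3']
```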
04-06-2022
08:50 AM
Hi, you would need to forward the audit logs from the Splunk UF to the Splunk indexers. https://community.splunk.com/t5/Getting-Data-In/Does-Splunk-Universal-Forwarder-forward-audit-events/m-p/403192
04-06-2022
08:14 AM
Could you please paste the output you want? The question doesn't make sense to me: when you say sum(expense) by ID, in Splunk that means sum(expense) by distinct ID, which will always equal the expense value shown in the input table. Please show us the desired output in tabular format.
04-06-2022
08:07 AM
I do not think you can change the path explicitly in SPL: https://community.splunk.com/t5/Getting-Data-In/How-to-change-the-location-a-saved-search-outputs-a-CSV-file-to/m-p/204917 However, you can write a cron job to move the file at the OS level.
04-05-2022
02:10 PM
You would need to format the output:
<your search>
| table count index sourcetype time results
| eval _raw = mvzip(mvzip(mvzip(mvzip(count, index, " "), sourcetype, " "),time, " "),results, " ")
| outputtext usexml=false
| rename _xml as raw
| fields raw
| fields - _*
| outputcsv append=t results.txt
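The nested mvzip calls pair the multivalue fields element-wise with a space delimiter; in Python terms (all sample values hypothetical):

```python
count = ["5", "7"]
index = ["main", "web"]
sourcetype = ["syslog", "access"]
time = ["2022-04-05", "2022-04-06"]
results = ["ok", "fail"]

# nested mvzip(..., " ") is just an element-wise space join
rows = [" ".join(t) for t in zip(count, index, sourcetype, time, results)]
print(rows)
# → ['5 main syslog 2022-04-05 ok', '7 web access 2022-04-06 fail']
```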
04-05-2022
01:55 PM
See if this helps: https://community.splunk.com/t5/Splunk-Search/Export-results-to-txt/m-p/105583
04-01-2022
11:49 AM
Try this:
index=connections
| stats max(_time) as connection_time values(user) as user by ip_address
| join type=left ip_address
[ search index=events event_id=*
| rename ip as ip_address
| stats max(_time) as provoked_time values(event_id) as event_id by ip_address ]
| convert ctime(*_time)
04-01-2022
07:50 AM
You need to use this regex on the search head. Go to Settings » Fields » Field extractions » Add new:
Destination App: <your_app>
Name: <name>
Apply to: choose sourcetype : named <your_sourcetype>
Type: Inline
Extraction/Transform: \d{2}:\d{2}:\d{2},(?<src_ip>(?:(?:25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)\.){3}(?:25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?))\,(?<dst_ip>(?:(?:25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)\.){3}(?:25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?))\,(?:[^:\n]*:){2}\d+,\d+,\d+,(?<src_port>\d+)\,(?<dst_port>[^\,]+)(?:[^,\n]*,){5}(?<action>[^\,]+)

Let me know if this helps!
03-31-2022
03:54 PM
1 Karma
Please refer to this: https://community.splunk.com/t5/Dashboards-Visualizations/How-can-I-get-the-Table-cell-colorization-rendering-for-every/td-p/289392
03-31-2022
03:13 PM
Try this:
index=connections
| stats max(_time) as connection_time by ip_address user
| join type=left ip_address
[ search index=events event_id=*
| rename ip as ip_address
| stats max(_time) as provoked_time values(event_id) as event_id by ip_address ]
| convert ctime(*_time)

Upvote/Accept if it works.
03-31-2022
02:43 PM
Hi, there are two ways to do this.

1st way: put the specific error in the main search, and you will get the storenumber counts for that error.
index=* host="log*" "store license for" "Error"
| rex field=_raw "Store\s(?P<storenumber>.*)"
| stats count by storenumber

2nd way: extract the error from the raw data and display/filter it in the statistics.
index=* host="log*" "store license for"
| rex field=_raw "Store\s*(?P<storenumber>\d+)\n*.*ERROR(?<Error>.*)"
| stats count by storenumber Error

Let me know if this helps!
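The second rex can be checked in Python against a made-up raw event. The layout is an assumption (store number at the end of one line, the ERROR text on the next), since the real log format wasn't shown:

```python
import re

raw = "store license for Store 1042\nERROR license check failed"  # hypothetical event
m = re.search(r"Store\s*(?P<storenumber>\d+)\n*.*ERROR(?P<Error>.*)", raw)
print(m.group("storenumber"))    # → 1042
print(m.group("Error").strip())  # → license check failed
```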
03-31-2022
11:11 AM
Hi, please make sure the "show only popular" box is unchecked in Settings > Source Types to see the full list of sourcetypes. Also, look for the sourcetype on the server where it was created: if you created the sourcetype on the DB Connect Splunk server, delete it there. Please refer to this document: https://docs.splunk.com/Documentation/Splunk/8.2.5/Data/Managesourcetypes#Delete_a_source_type
03-31-2022
10:55 AM
Yes, you can use this regex in props.conf. If you want to add a search-time field extraction within props.conf, just use EXTRACT:
[your-sourcetype]
EXTRACT-<class> = [<regex>|<regex> in <src_field>]
* Used to create extracted fields (search-time field extractions) that do
not reference transforms.conf stanzas.
For reference see: http://docs.splunk.com/Documentation/Splunk/8.2.5/Admin/Propsconf
Please keep in mind that this will require a refresh: http[s]://[splunkweb hostname]:[splunkweb port]/debug/refresh
03-31-2022
10:49 AM
1 Karma
Hey, not sure if there is an easier way to do this, but you can give this a try:
<user search>
|eval tagged=mvzip(user,builtinadmin)
| mvexpand tagged
| makemv tagged delim=","
| eval user=mvindex(tagged,0)
| eval builtinadmin=mvindex(tagged,1)
| table dest user builtinadmin

Let me know if this helps!
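What the mvzip / mvexpand / mvindex round trip does, sketched in Python with hypothetical values:

```python
user = ["alice", "bob"]
builtinadmin = ["true", "false"]

# eval tagged=mvzip(user, builtinadmin): pair the multivalue fields
tagged = [",".join(p) for p in zip(user, builtinadmin)]
# mvexpand + makemv + mvindex: one row per pair, split back into fields
rows = [(t.split(",")[0], t.split(",")[1]) for t in tagged]
print(rows)
# → [('alice', 'true'), ('bob', 'false')]
```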
03-31-2022
10:32 AM
1 Karma
Hello, if it's a dynamic list of hosts, you could create a lookup table for the hosts via Settings » Lookups » Lookup table files » New Table Lookup File, and use the search below:
index=<your_index> [| inputlookup hosts.csv | table host ]
Select a date range at the top right of the search bar to get the appropriate results. Let me know if this helps!
03-31-2022
10:13 AM
Hey, could you please try this:
| makeresults
| eval _raw="Mar 31 18:18:35 LUM-EVERE-PAFW-R8-17-T1 1,2022/03/31 18:18:35,015701001564,TRAFFIC,drop,2305,2022/03/31 18:18:35,10.81.13.68,34.240.162.53,0.0.0.0,0.0.0.0,prodedfl_access_1289,,,not-applicable,vsys4,prodedfl,prodcore,ae1.1512,,Syslog_Server,2022/03/31 18:18:35,0,1,60353,443,0,0,0x0,tcp,deny,66,66,0,1,2022/03/31 18:18:35,0,any,0,7022483376390954281,0x8000000000000000,10.0.0.0-10.255.255.255,Ireland,0,1,0,policy-deny,920,0,0,0,Production,LUM-EVERE-PAFW-R8-17-T1,from-policy,,,0,,0,,N/A,0,0,0,0,2d8c02f8-e86f-43cf-a459-01acdb26580a,0,0,,,,,,,"
| rex "\d{2}:\d{2}:\d{2},(?<src_ip>(?:(?:25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)\.){3}(?:25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?))\,(?<dst_ip>(?:(?:25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)\.){3}(?:25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?))\,(?:[^:\n]*:){2}\d+,\d+,\d+,(?<src_port>\d+)\,(?<dst_port>[^\,]+)(?:[^,\n]*,){5}(?<action>[^\,]+)"

Let me know if this helps! Thanks, Mayur
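Running essentially the same pattern in Python against the sample event above extracts the expected fields. Two adjustments for this sketch: named groups use Python's `(?P<...>)` syntax, the IP-octet alternation is shortened to `\d{1,3}` for readability, and the sample raw string is trimmed after the fields of interest.

```python
import re

raw = ("Mar 31 18:18:35 LUM-EVERE-PAFW-R8-17-T1 1,2022/03/31 18:18:35,015701001564,"
       "TRAFFIC,drop,2305,2022/03/31 18:18:35,10.81.13.68,34.240.162.53,0.0.0.0,0.0.0.0,"
       "prodedfl_access_1289,,,not-applicable,vsys4,prodedfl,prodcore,ae1.1512,,Syslog_Server,"
       "2022/03/31 18:18:35,0,1,60353,443,0,0,0x0,tcp,deny,66,66,0,1")

pattern = (r"\d{2}:\d{2}:\d{2},(?P<src_ip>(?:\d{1,3}\.){3}\d{1,3}),"
           r"(?P<dst_ip>(?:\d{1,3}\.){3}\d{1,3}),(?:[^:\n]*:){2}\d+,\d+,\d+,"
           r"(?P<src_port>\d+),(?P<dst_port>[^,]+)(?:[^,\n]*,){5}(?P<action>[^,]+)")
m = re.search(pattern, raw)
print(m.group("src_ip"), m.group("dst_ip"), m.group("src_port"),
      m.group("dst_port"), m.group("action"))
# → 10.81.13.68 34.240.162.53 60353 443 deny
```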
08-16-2021
01:31 PM
Hello, the question is pretty straightforward: I would like to alert if 3 failed logins followed by 1 successful login from one user is observed. For example:
Minute | user | action |
---|---|---|
1st minute | xyz | failure |
2nd minute | xyz | failure |
3rd minute | xyz | failure |
4th minute | xyz | success |
If this condition occurs, I would like to create an alert. Thanks in advance.
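In plain terms, the condition being asked for is a per-user failure streak that a success resets; a Python sketch of just the detection logic (not an SPL answer), assuming a simple ordered event stream:

```python
def alerts(events):
    """events: (user, action) tuples in time order, action is 'failure' or 'success'.
    Alert when a success follows 3 or more consecutive failures by the same user."""
    streak, hits = {}, []
    for user, action in events:
        if action == "failure":
            streak[user] = streak.get(user, 0) + 1
        else:
            if streak.get(user, 0) >= 3:
                hits.append(user)
            streak[user] = 0
    return hits

print(alerts([("xyz", "failure"), ("xyz", "failure"),
              ("xyz", "failure"), ("xyz", "success")]))
# → ['xyz']
```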
Labels: stats
04-14-2021
11:57 AM
Try this:
| rex field=url_field "http(|s):\/\/(?<url>[^\/]+)"
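A quick Python check of the same pattern (the sample URLs are hypothetical):

```python
import re

hosts = []
for url in ["https://example.com/a/b", "http://host.local"]:
    m = re.search(r"http(|s):\/\/(?P<url>[^\/]+)", url)
    hosts.append(m.group("url"))
print(hosts)
# → ['example.com', 'host.local']
```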
Tags: regex
06-11-2020
08:57 AM
Could you please tell us what your output should look like for those sample logs?
06-10-2020
11:54 AM
1 Karma
Try this:
| tstats latest(_time) as latest earliest(_time) as earliest where source=*restart.log by source
| eval diff=latest-earliest
| convert ctime(latest) ctime(earliest)
| eval diff=tostring(diff, "duration")