Activity Feed
- Karma Re: How to generate a search that will find values which are hexadecimal only? for cmerriman. 06-05-2020 12:48 AM
- Karma Re: How to generate a search that will find values which are hexadecimal only? for gokadroid. 06-05-2020 12:48 AM
- Karma Re: How to generate a search that will only display results where a field contains some non-alphanumeric characters? for somesoni2. 06-05-2020 12:48 AM
- Karma Re: condense many line response results to a small table for somesoni2. 06-05-2020 12:48 AM
- Got Karma for Re: How to write a search to only return results where multiple values exist?. 06-05-2020 12:48 AM
- Posted How to generate a search that will only display results where a field contains some non-alphanumeric characters? on Splunk Search. 03-23-2017 01:52 PM
- Tagged How to generate a search that will only display results where a field contains some non-alphanumeric characters? on Splunk Search. 03-23-2017 01:52 PM
- Posted Re: condense many line response results to a small table on Splunk Search. 03-06-2017 02:52 PM
- Posted Re: condense many line response results to a small table on Splunk Search. 03-06-2017 02:49 PM
- Posted condense many line response results to a small table on Splunk Search. 03-06-2017 12:18 PM
- Tagged condense many line response results to a small table on Splunk Search. 03-06-2017 12:18 PM
- Posted Re: How to generate a search that will find values which are hexadecimal only? on Splunk Search. 11-28-2016 08:29 AM
- Posted Re: How to generate a search that will find values which are hexadecimal only? on Splunk Search. 11-28-2016 08:05 AM
- Posted How to generate a search that will find values which are hexadecimal only? on Splunk Search. 11-28-2016 07:31 AM
03-23-2017
01:52 PM
I have a field named product. I want to produce a list of the products in my source that are not made up of only English alphanumeric characters (of any length).
For example, these products:
Dog Collar
18 inch Dog Collar
20-inch Dog Collar
Ƨhock collar
would yield only:
20-inch Dog Collar
Ƨhock collar
(Because of the non-English character and the hyphen.)
I've seen plenty of uses of sed to remove/replace the non-alphanumerics, but I don't want to remove them; I just want a list of the outliers.
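For reference, the kind of search I've been sketching (untested, and the index/sourcetype names are just placeholders) uses match() to flag any product containing a character outside plain English alphanumerics and spaces:
index=myindex sourcetype=products
| where match(product, "[^A-Za-z0-9 ]")
| dedup product
| table product
The idea is that match() returns true as soon as a single character falls outside A-Z, a-z, 0-9, or space, so only the outliers survive the where, and dedup collapses repeats into a list.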
03-06-2017
02:52 PM
I asked the same question on StackOverflow if you want to double dip on the points 🙂
03-06-2017
02:49 PM
This is much better than the direction I was going!
I was trying to use transaction, but couldn't match up the file sizes with the names after the fact.
03-06-2017
12:18 PM
I'm working with email response data that comes into my index as individual entries. Each email message can have more than 100 entries in the index.
I'm trying to build tables to make the data easy to read.
This is what some simplified and sanitized results from my search look like:
[01:00:22.164297] x=ABC mod=mail cmd=msg rule=ruleQ subject="Test 123" size=8583
[01:00:22.136496] x=ABC mod=spam cmd=run rule=notspam
[01:00:22.106325] x=ABC mod=spam cmd=run policy=outbound
[01:00:22.067675] x=ABC mod=mail cmd=attachment file=text.html size=3347
[01:00:22.039732] x=ABC mod=mail cmd=attachment file=text.txt size=2093
[01:00:22.010986] x=ABC mod=session cmd=data rcpt=personA@rec.org
[01:00:22.010986] x=ABC mod=session cmd=data rcpt=personB@rec.org
[01:00:22.000234] x=ABC mod=mail sender=noreply@sender.org
Here is the data tabled to show how it is structured, for the columns I care about:
╔═══════════════════╦═════╦══════════╦═══════════╦════════════════════╦═════════════════╦══════╦═════════╗
║ time ║ x ║ subject ║ file ║ sender ║ rcpt ║ size ║ rule ║
╠═══════════════════╬═════╬══════════╬═══════════╬════════════════════╬═════════════════╬══════╬═════════╣
║ [01:00:22.164297] ║ ABC ║ Test 123 ║ ║ ║ ║ 8583 ║ ruleQ ║
║ [01:00:22.136496] ║ ABC ║ ║ ║ ║ ║ ║ notspam ║
║ [01:00:22.106325] ║ ABC ║ ║ ║ ║ ║ ║ ║
║ [01:00:22.067675] ║ ABC ║ ║ text.html ║ ║ ║ 3347 ║ ║
║ [01:00:22.039732] ║ ABC ║ ║ text.txt ║ ║ ║ 2093 ║ ║
║ [01:00:22.010986] ║ ABC ║ ║ ║ ║ personA@rec.org ║ ║ ║
║ [01:00:22.010986] ║ ABC ║ ║ ║ ║ personB@rec.org ║ ║ ║
║ [01:00:22.000234] ║ ABC ║ ║ ║ noreply@sender.org ║ ║ ║ ║
╚═══════════════════╩═════╩══════════╩═══════════╩════════════════════╩═════════════════╩══════╩═════════╝
This is what I'd like to get back:
╔═══════════════════╦═════╦══════════╦═══════════╦════════════════════╦═════════════════╦══════╦═════════╗
║ time ║ x ║ subject ║ file ║ sender ║ rcpt ║ size ║ rule ║
╠═══════════════════╬═════╬══════════╬═══════════╬════════════════════╬═════════════════╬══════╬═════════╣
║ [01:00:22.164297] ║ ABC ║ Test 123 ║ text.html ║ noreply@sender.org ║ personA@rec.org ║ 3347 ║ notspam ║
║ [01:00:22.164297] ║ ABC ║ Test 123 ║ text.txt ║ noreply@sender.org ║ personA@rec.org ║ 2093 ║ notspam ║
║ [01:00:22.164297] ║ ABC ║ Test 123 ║ text.html ║ noreply@sender.org ║ personB@rec.org ║ 3347 ║ notspam ║
║ [01:00:22.164297] ║ ABC ║ Test 123 ║ text.txt ║ noreply@sender.org ║ personB@rec.org ║ 2093 ║ notspam ║
╚═══════════════════╩═════╩══════════╩═══════════╩════════════════════╩═════════════════╩══════╩═════════╝
As you can see, the transformations I want for the data include:
- Create a unique row for each person receiving each attachment.
- The size value is for the attachment, while the size of the whole message is dropped.
- The time from the entry which contains the subject is used for each entry.
- The 'rule' from mod=spam AND rule!=null fills in the rule column for each entry, and the rule from the line which contains the subject is ignored.
- The subject, sender and rule get copied to every entry.
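For reference, the rough direction I've been sketching (completely untested; the index and sourcetype are placeholders, and the field names are taken from the sample above) is to roll everything up per message ID x with stats and then expand back out per recipient and per attachment:
index=mail sourcetype=maillog
| eval attach=if(cmd="attachment", file . ":" . size, null())
| stats max(eval(if(isnotnull(subject), _time, null()))) as time
    values(subject) as subject
    values(sender) as sender
    values(rcpt) as rcpt
    values(attach) as attach
    values(eval(if(mod="spam" AND isnotnull(rule), rule, null()))) as rule
    by x
| mvexpand rcpt
| mvexpand attach
| rex field=attach "^(?<file>.+):(?<size>\d+)$"
| table time x subject file sender rcpt size rule
The eval glues each attachment's name and size together so they stay paired through stats, the two mvexpands fan the rolled-up row back out per recipient and per attachment, and the rex splits the pair apart again. I'm not at all sure this is the cleanest way to carry the attachment size through, so better approaches are welcome.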
11-28-2016
08:29 AM
The latter worked for me, since there is no 0x preceding the values. Thanks!
11-28-2016
08:05 AM
I'm still getting all the values for the field.
I piped this in right before my call to stats, and my tables are still full of both hex and non-hex values:
search | rex field=devicename "(?<devicename>[0-9a-fA-F]{13})" | stats values(devicename) as devices by user | where mvcount(devices)>1
I'm trying to get a list of users who have more than 1 device assigned which has a hexadecimal device name, along with the names of the hexadecimal devices.
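One thing I'm starting to suspect: rex only extracts a field and doesn't drop the events that fail to match, so maybe what I actually need is the regex command to filter instead; something like this, though I haven't tested it yet:
search | regex devicename="^[0-9a-fA-F]{13}$" | stats values(devicename) as devices by user | where mvcount(devices)>1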
11-28-2016
07:31 AM
I have a query that returns a field whose value is sometimes a 13-digit hexadecimal string and sometimes a string that may or may not be 13 characters long. I'd like to create an output of just the items that match hex.
There is a lot written about converting hex, but I want to leave the values intact for the search.
Thanks!
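To make it concrete, what I'm picturing is something that keeps only the events where the field is exactly 13 hex characters, without converting the value. Very roughly (the index and field names here are just placeholders):
index=myindex | regex devicename="^[0-9a-fA-F]{13}$" | table devicename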
11-22-2016
02:11 PM
This seems so obvious now that I see it. Thank you!
11-22-2016
02:11 PM
1 Karma
This was a good solution, but the other was a little simpler. Thanks for your time!
11-22-2016
01:17 PM
I have a log output which provides many fields, but the two I'm most concerned with are user and device.
I'm trying to output a list of users and the devices associated with each user from the log data, but only when the user has more than one device associated, e.g.:
JohnSmith DeviceA
JohnSmith DeviceB
SteveSmith DeviceB
SteveSmith DeviceC
TedSmith DeviceX
TedSmith DeviceY
TedSmith DeviceZ
I don't care about users who have only one device assigned, and want to focus on users where the distinct count of devices >1.
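To show the shape of what I'm after, I imagine the search ends up being something along these lines (just a sketch, and the index name is made up):
index=mylogs | stats dc(device) as device_count values(device) as devices by user | where device_count > 1 | table user devices
i.e. count the distinct devices per user, keep only the users with more than one, and list the device names alongside each user.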
05-23-2016
07:53 AM
Your method would definitely work, but the comparison file may change on each run.
I'd like to find a way to do it programmatically, so I don't have to log in separately to Splunk to modify the input file and props.conf every time the job runs (which will likely be hourly).
05-20-2016
02:20 PM
Since the row has customer_name=abc, you can reference customer_name directly in the search, and it will include the value. Splunk is really smart that way.
As for the IP address, try this regex instead:
| rex field=_raw "(?<ipaddress>(\d{1,3}\.){3}\d{1,3})" |stats COUNT by ipaddress, customer_name
The regex looks for 1-3 digits followed by a '.', three times over, followed by 1-3 digits, and assigns the match to a newly created field called ipaddress.
05-20-2016
02:03 PM
I'm trying to run a search that returns results when a field matches one of many predetermined values, and I'm worried about the logistics and resource cost of processing a large number of OR clauses.
I'm building an external app that is making calls to Splunk through the Python SDK, and I've found searching for a few expressions is pretty basic:
# Assumes splunklib is installed and `service` is an already-authenticated client.connect() session
import splunklib.results as results

kwargs_oneshot = {"earliest_time": "-1h", "latest_time": "now"}
searchquery_oneshot = "search index=foo sourcetype=bar (url=google* OR url=yahoo* OR url=splunk*)"
oneshotsearch_results = service.jobs.oneshot(searchquery_oneshot, **kwargs_oneshot)
reader = results.ResultsReader(oneshotsearch_results)
But what do I do if the external data source feeding my script provides 1000+ URLs?
I think a lookup table is probably the best route, but I can't seem to find any way to programmatically upload one as part of the oneshot, and the list of URLs will likely be different each time I execute the script from my application, which resides on a different server. I don't want to have to SSH into my Splunk server before I run the job, especially since the external app will run autonomously.
Is there a way to upload a file as part of the oneshot API call, so I can just do a lookup against the index?
I guess the more relevant question is: should I even bother? Would a set of 1000 OR clauses actually run faster than the lookup?
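For what it's worth, if the lookup file did already exist on the search head, I believe the search side would be simple, something like this (the lookup file name is made up):
search index=foo sourcetype=bar [| inputlookup url_list.csv | fields url]
since the subsearch expands the url column into a big (url=... OR url=...) expression anyway. That's partly why I'm wondering whether just generating the 1000 OR clauses in my script is really any worse.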