Activity Feed
- Got Karma for Why am I getting Invalid lookup table?. 06-05-2020 12:48 AM
- Got Karma for How to set the indexes for a single HTTP event collector input in an indexer cluster?. 06-05-2020 12:48 AM
- Got Karma for Re: Substituting values from MV fields?. 06-05-2020 12:48 AM
- Got Karma for Splunk Add-on for Microsoft Cloud Services with proxy?. 06-05-2020 12:48 AM
- Got Karma for Why are some events not summarized in a data model?. 06-05-2020 12:48 AM
- Got Karma for How do I apply my custom CSS to each page I select from my custom navbar in an app?. 06-05-2020 12:47 AM
- Got Karma for Datamodel with spaces in field names?. 06-05-2020 12:47 AM
- Got Karma for Datamodel with spaces in field names?. 06-05-2020 12:47 AM
- Got Karma for Datamodel with spaces in field names?. 06-05-2020 12:47 AM
- Got Karma for Dashboard table styling?. 06-05-2020 12:47 AM
- Got Karma for Setting up permissions for viewing alerts?. 06-05-2020 12:47 AM
- Posted How do you store the matching value as a field? on Splunk Search. 11-19-2018 08:26 AM
- Posted Re: Why am I getting less search results when reading data from the KV store via inputlookup? on Reporting. 02-16-2018 07:16 AM
- Posted Re: Why am I getting less search results when reading data from the KV store via inputlookup? on Reporting. 02-16-2018 07:14 AM
- Posted Why am I getting less search results when reading data from the KV store via inputlookup? on Reporting. 02-14-2018 09:25 AM
- Posted Re: How to set the indexes for a single HTTP event collector input in an indexer cluster? on Deployment Architecture. 01-30-2018 08:30 AM
- Posted Re: How to set the indexes for a single HTTP event collector input in an indexer cluster? on Deployment Architecture. 01-30-2018 07:28 AM
- Posted Using _time as a discriminator without time span? on Splunk Search. 09-18-2017 06:05 AM
- Posted Re: How do I deduplicate events with such conditions? on Splunk Dev. 08-24-2017 12:29 AM
11-19-2018
08:26 AM
Suppose I have a query like:
index=my_index stringA OR stringB OR stringC | table logentry, whatmatched
And for the "whatmatched" field, I would like to capture which particular string matched my raw data, yielding output like:
logentry | whatmatched
this is message with stringB | stringB
stringC comes here | stringC
Is it possible to extract this somehow?
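A sketch of one possible approach (assuming the match targets are literal strings in the raw event): chain `eval`/`case` with `match()` to record which pattern hit. Note that `case()` returns only the first matching branch, so an event containing two of the strings is labeled with the first one only.

```
index=my_index stringA OR stringB OR stringC
| eval whatmatched=case(
    match(_raw, "stringA"), "stringA",
    match(_raw, "stringB"), "stringB",
    match(_raw, "stringC"), "stringC")
| table logentry, whatmatched
```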
02-16-2018
07:16 AM
Please see my answer below; I wanted to post it here, but overlooked where I was typing, sorry.
02-16-2018
07:14 AM
It is defined with 13 fields in my collections.conf, all strings like this:
field.field1 = string
field.field2 = string
[...]
accelerated_fields.my_accel = {"field1"}
replicate = false
I refer to it in my transforms.conf like this:
collection = mycollection
external_type = kvstore
fields_list = _key, field1, field2, ...
When the outputlookup search runs, I'm not explicitly using any _key field, but before that command the results are narrowed down with the fields command to only those fields that are also used in the KV store.
Other KV store lookups that are defined and used the same way seem to be working, and recently I had an error where the KV store was unable to initialize, so I suspect this issue lies with the KV store subsystem itself rather than with this particular lookup.
I've been searching for that error as well, but the only answer I found here is about an expired SSL certificate, which does not apply to my case.
02-14-2018
09:25 AM
I have a KV store defined on my search head, which is not replicated.
I'm populating it with data via a scheduled search (outputlookup), which produces approx. 750,000 rows.
When I read the data back from the KV store via inputlookup, I get only around 1,500 rows.
The populating search takes a really long time to finish because of the | outputlookup my_kvstore command at the end of the search.
If I remove this and replace it with an outputcsv, it is much faster.
What are the limitations of KV stores, and how can I troubleshoot this?
I was looking at the mongod.log, but haven't seen any errors.
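For reference, the row cap on KV store reads lives in limits.conf; a sketch of the relevant stanza (the values shown are the documented defaults, worth verifying against your Splunk version):

```
[kvstore]
# maximum number of rows a query (e.g. inputlookup) may return from a KV store
max_rows_per_query = 50000
# maximum number of documents saved per batch (affects outputlookup write speed)
max_documents_per_batch_save = 1000
```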
01-30-2018
08:30 AM
Yes, I've left it as main, and then changed it to my "real" index in the *.conf file.
01-30-2018
07:28 AM
Yes: first I create the input in the web GUI as described in the documentation, and then I go to the heavy forwarder instance via the filesystem (RDP/SSH) to /etc/apps/splunk_httpin (not 100% sure if this is the app's name, something similar)/local/inputs.conf.
In this file, change the index = setting to the index you want to use.
Hope this helps.
09-18-2017
06:05 AM
I want to use the _time field as one of my discriminator fields in a tstats command. I wasn't able to figure out how to do this without the time values being rounded/grouped into some time bucket.
For other fields used as discriminators, every existing value is displayed as a separate row, but with _time, even when I'm not using any span= in my command, the values are grouped somehow.
In this case I have really rare events; that's why I want the exact time values here.
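One hedged workaround, assuming sub-second precision is not required: tstats applies a default span to _time, so explicitly forcing the smallest span keeps each distinct second on its own row (my_model is a placeholder name):

```
| tstats count from datamodel=my_model by _time span=1s, host
```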
08-24-2017
12:29 AM
Unfortunately this is not what I need; please see my update on the original post above.
08-23-2017
01:12 PM
So I've got multiple custom data sources, mainly scripts, which send events to Splunk on some schedule/recurrence.
I can distinguish every execution of these sources by either a timestamp or a custom ID, which gets incremented with every execution and is captured in every event. The events always have a proper host field, which also contributes to the "unique key" of an event together with the unique ID mentioned above. The hosts are attributed with custom fields; this is the third part of what could be used as a unique key. These fields are always present in the events as long as they apply to a given host, and are no longer present once they don't apply.
An example what I mean (every line is a separate event):
hostID=host1, attributeID=attribute1, customid=customid1
hostID=host1, attributeID=attribute2, customid=customid1
hostID=host2, attributeID=attribute1, customid=customid2
hostID=host1, attributeID=attribute1, customid=customid2
(Because of the _time field, these would appear in Splunk in reverse order obviously)
I want to deduplicate such events so that I always keep only the data from the truly last execution of a script. From the above example, I want to keep only:
host2, attribute1, customid2
host1, attribute1, customid2
If I were to use
| dedup hostID, attributeID, customid
It would yield:
- host1, attribute2, customid1
- host2, attribute1, customid2
- host1, attribute1, customid2
The solution my team came up with is:
<base search> | eventstats max(customid) as max_customid by hostID | search customid=max_customid
This pretty much does the job, but I feel it's really not efficient. What would be the right approach to do this?
===EDIT
One given host has multiple events (with multiple attributes) from the same execution of the script.
A more detailed example, let's say I got these events:
hostID=host1, attributeID=attribute1, customid=customid1
hostID=host1, attributeID=attribute2, customid=customid1
hostID=host2, attributeID=attribute1, customid=customid2
hostID=host1, attributeID=attribute1, customid=customid2
hostID=host1, attributeID=attribute3, customid=customid2
hostID=host1, attributeID=attribute4, customid=customid2
hostID=host2, attributeID=attribute3, customid=customid2
I want to keep the below events:
hostID=host2, attributeID=attribute1, customid=customid2
hostID=host1, attributeID=attribute1, customid=customid2
hostID=host1, attributeID=attribute3, customid=customid2
hostID=host1, attributeID=attribute4, customid=customid2
hostID=host2, attributeID=attribute3, customid=customid2
This is the reason I can't use stats first().
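One note on the eventstats approach quoted above: a field-to-field comparison needs `where` rather than `search` (the `search` command would treat max_customid as a literal string value). A sketch of that variant:

```
<base search>
| eventstats max(customid) as max_customid by hostID
| where customid=max_customid
| fields - max_customid
```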
06-09-2017
03:47 AM
1 Karma
I'm trying to use the above add-on to get Azure audit logs, and I want to use a proxy. Everything is configured according to the docs, but whenever the script tries to download the data, I'm getting a 10061 error.
I double-checked with Wireshark: the script is not even trying to connect to my proxy; instead it tries to connect directly to the Microsoft servers. I'm running on a Windows server.
How can I get this working?
05-31-2017
01:22 AM
Already solved here:
https://answers.splunk.com/answers/208682/rest-api-why-am-i-hitting-a-limit-of-100-exported.html
05-31-2017
12:34 AM
I'm trying to run a search via the REST API to get the contents of a lookup:
| inputlookup my_lookup
I get only 100 rows in the results, which is weird, because my lookup is much bigger. I've checked limits.conf; I'm using the default setting of 50,000 rows.
Is there something specific to lookups? All other search queries I ran via REST worked as expected. I wasn't able to find any indication of this limitation in the HTTP requests/responses used for the search.
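One thing worth checking (an assumption, not confirmed in the thread): the REST results endpoint itself defaults to returning 100 rows regardless of limits.conf, unless a count parameter is passed; count=0 requests all available rows. A sketch of the request shape (<sid> is a placeholder for the search job ID):

```
GET /services/search/jobs/<sid>/results?count=0&output_mode=json
```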
04-24-2017
01:24 AM
Update:
So, the 407 issue has been sorted, though not with authentication. Splunk is now able to open connections towards the public web, but not in all cases: some of my threat feeds in Enterprise Security work nicely, but some of them are still failing. I have no idea where else I could configure the proxy, or what is causing this inconsistent behavior.
04-24-2017
01:20 AM
I've downloaded the Linux Secure TA from Splunkbase and applied it to my Splunk instance, and I've been facing a strange issue I can't understand.
There are a couple of field aliases defined in it:
FIELDALIAS-rhost = rhost AS src_ip <<<<<<<<<<
FIELDALIAS-src_user = ruser AS src_user
FIELDALIAS-app = process AS app
FIELDALIAS-vendor_product = process AS vendor_product
FIELDALIAS-dest = host AS dest
FIELDALIAS-dest_host = host AS dest_host
FIELDALIAS-dest_nt_domain = kerberos_domain AS dest_nt_domain
FIELDALIAS-src = src_ip AS src <<<<<<<<<<<<<<
I've marked the problematic ones with the arrows.
So, src_ip is created just as expected. However, src is not, except in a really small number of events. Doing a quick spot check, I had about 25 different values in src_ip but only 2 in src.
My search looked like simply this:
index=ftp
Going further, when I selected a value from src_ip and ran my search like this:
index=ftp src_ip=1.2.3.4
then the src field was created, and it had the 1.2.3.4 value in it (previously the value was present in src_ip but not in src).
After this, I created a new props.conf in the local folder of the TA and added the following lines:
FIELDALIAS-rhost = rhost AS src_ip
FIELDALIAS-rhost_test= rhost AS src <<<<<<<
Now it works for 100% of the events; both src_ip and src are created with all the values.
I just can't understand what went wrong with the original configuration.
What am I missing?
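One hedged explanation (an assumption, not confirmed in the post): FIELDALIAS settings within a stanza are not applied in a guaranteed order, so an alias that reads another alias's output (src_ip AS src, where src_ip is itself an alias of rhost) can be evaluated before src_ip exists. Aliasing both fields directly from the original extracted field sidesteps the ordering, which matches the fix described above:

```
FIELDALIAS-rhost      = rhost AS src_ip
FIELDALIAS-rhost_test = rhost AS src
```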
04-06-2017
04:06 AM
1 Karma
This is great, thanks!
04-03-2017
08:00 AM
In Enterprise Security, for a drilldown action I want to use a field from the notable events, which can be multivalued as well.
In this case, if I simply do my_field_in_the_search=$myfield_in_the_notable$ in the drilldown search, the values are presented as follows:
my_field_in_the_search=value1,value2,value3
Obviously it cannot be used like this; I would like to achieve something like
my_field_in_the_search=value1 OR my_field_in_the_search=value2, etc.
Is it possible to do this somehow?
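One possible sketch (an assumption worth testing, particularly around quoting and escaping): since the token expands to a comma-separated list, the SPL IN operator may accept it directly in the drilldown search:

```
my_field_in_the_search IN ($myfield_in_the_notable$)
```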
03-17-2017
09:13 AM
1 Karma
I have an accelerated data model configured, and if I run tstats against it, I get the results as expected.
However, if I add summariesonly=t to my tstats, I get fewer results. I've tried rebuilding the data model; the exact same issue happens. Could something be misconfigured, resulting in some of the events not getting summarized?
03-17-2017
01:54 AM
I'm running Splunk 6.5.2 on Windows Server 2012 R2, and I just cannot get the proxy working.
I've tried setting it in splunk-launch.conf and/or as an environment variable, for both http_proxy and https_proxy, but none of that helped; I'm getting Winsock 10061 errors all the time. I've tried both formats: <host>:<port> and http(s)://<host>:<port>.
Besides that, I want to use a couple of apps (downloaded from Splunkbase); some of them have their own configuration where I can specify proxy settings, and there I'm getting '407 Proxy Authentication Required' errors.
However, our proxy does not require authentication. I've tried running web requests with the same Python modules the apps use (urllib2, requests), and it worked for me.
03-14-2017
06:52 AM
Thanks, this has been a headache for me for a while 🙂
03-14-2017
06:32 AM
I have an app installed from Splunkbase which has custom search commands defined in it. I've set the commands to be globally available, and it works fine: I can invoke the commands from any of the apps I have in Splunk, except Enterprise Security.
Is there a way to configure ES to be able to invoke commands from another app's context?
03-07-2017
01:42 AM
Yes, sorry for being confusing. It's an app downloaded from Splunkbase.
03-06-2017
04:25 AM
I have a command defined in an application provided by a vendor, which I would like to use in another application installed in Splunk, as a workflow action.
I've tried invoking the command from an inline search, but I'm getting an "Unknown search command" error.
In the application where the command is defined, I've already set the permissions for the command to be globally available for everyone:
[]
access = read : [ * ], write : [ admin, power, splunk-system-user ]
export = system
[commands]
export = system
What else do I need to set to make this available?