Try this:
| makeresults
| eval host="host3.CA.domain.com"
| eval host=if(match(host, "^\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}$"), host, replace(host, "^([^\.]+)\..*$", "\1"))
The eval functions are documented in the Splunk docs; the regex keeps only the part of the hostname before the first dot, unless the host is already an IPv4 address.
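The same logic, sketched in Python as a quick sanity check (Python's re stands in for SPL's match/replace here, and the function name shorten_host is made up for this example):

```python
import re

def shorten_host(host: str) -> str:
    """Mimic the SPL eval: keep IPv4 addresses intact, otherwise
    strip everything after the first dot of the hostname."""
    if re.match(r"^\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}$", host):
        return host
    return re.sub(r"^([^.]+)\..*$", r"\1", host)

print(shorten_host("host3.CA.domain.com"))  # host3
print(shorten_host("10.1.2.3"))             # 10.1.2.3
```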
Did you try setting KV_MODE = json in the corresponding sourcetype in props.conf?
That should extract fields from the JSON on its own.
To access certain fields without changing KV_MODE, take a look at | spath here:
http://docs.splunk.com/Documentation/Splunk/6.3.3/SearchReference/Spath
Can you please add some details to your question? I don't really get what you're trying to do - screenshots, example data, etc. would make it easier to grasp.
Well, the message is already saying a lot about your problem.
You're running into the configured maximum limit for historical searches.
You can check the Monitoring Console to see how many searches are running, how many are being skipped and which ones, along with other scheduler details.
There's a lot of possible reasons, and you can find plenty of information by googling "Splunk maximum number of concurrent historical scheduled searches", including a few good blog posts on this very topic - check them out! 🙂
I would really advise not to do this.
The Windows Subsystem for Linux is nice, but it's far from perfect, and you will run into a bunch of problems using it with something as complex as Splunk. It lacks a lot of the more advanced features, and it is not worth the trouble you'll have with it - instead just spin up a VM 😉
Edit: Sorry for the thread necro, it popped up and I didn't check the timestamp 0=)
Ah, I didn't know that was possible - I rarely use the GUI. I fear that without actual access, troubleshooting this is difficult - maybe you can find errors in index=_internal ?
You could go with either crcSalt or initCrcLen.
As your filenames keep changing, the easiest would be an inputs.conf like this:
[monitor:///path/to/yourfilename]
crcSalt = <SOURCE>
It will just use the (always different) filename as a salt, so the checksum will differ for each new file - that should solve your problem.
If you had the same issue but the filename were always the same, you would instead raise initCrcLen until the checksummed portion of the file actually differs.
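To illustrate why the salt matters, here is a rough Python sketch using zlib.crc32 as a stand-in for Splunk's internal checksum (the file contents and paths are made up, and Splunk's actual CRC implementation differs, but the principle is the same):

```python
import zlib

# Splunk fingerprints a file by checksumming its first bytes
# (256 by default, tunable via initCrcLen). Two log files that
# share the same long header therefore look identical to the monitor.
header = b"# MyApp log file v1.0" + b"\x00" * 300
file_a = header + b"event data for monday\n"
file_b = header + b"event data for tuesday\n"

crc_a = zlib.crc32(file_a[:256])
crc_b = zlib.crc32(file_b[:256])
assert crc_a == crc_b  # same prefix -> same checksum -> file looks "already seen"

# crcSalt = <SOURCE> mixes the (unique) filename into the checksum,
# so each new file gets a distinct fingerprint.
salted_a = zlib.crc32(b"/var/log/app_monday.log" + file_a[:256])
salted_b = zlib.crc32(b"/var/log/app_tuesday.log" + file_b[:256])
assert salted_a != salted_b
```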
Did you add it via the GUI? The FORMAT = $1::$2 is essential; without it, the transform will most likely not return anything.
I tried that regex here with your sample data, so at least the regex should be fine:
https://regex101.com/r/5JcfIv/1
Two hints:
The line | search CVE= "*" contains a space between CVE= and "*" , which might cause trouble.
The sort command has an implicit limit of 10000 results, so you might not get everything back. Avoid this by using | sort 0 -CVE .
You should not remove these during indexing, because it will most likely break all your field extractions - unless all of this information has been extracted as index-time fields, which it most likely hasn't.
You could use a regex like this:
.*\smessage=(?<_raw>.*)$
This would replace the _raw field, which is what you're getting displayed as the actual event text.
So, you can simply set up a props.conf like this:
[your-sourcetype]
EXTRACT-shorten_raw_text = .*\smessage=(?<_raw>.*)$
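A quick way to sanity-check the pattern outside Splunk is to run it in Python (the sample event below is made up for illustration; Python's named-group syntax (?P<raw>...) stands in for Splunk's (?<_raw>...)):

```python
import re

# Hypothetical event: the regex keeps only what follows "message=".
event = "2018-05-04 12:00:01 level=INFO pid=4711 message=User logged in successfully"

m = re.match(r".*\smessage=(?P<raw>.*)$", event)
shortened = m.group("raw") if m else event
print(shortened)  # User logged in successfully
```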
Hope that helps - if it does I'd be happy if you would upvote/accept this answer, so others could profit from it. 🙂
I've no experience with Splunk Cloud, but on an on-premises installation you would have to do this via config files - there is no way to do it via the GUI. So unless Splunk Cloud offers something special for this case, I guess your way is through support.
Hey,
a proper regex would be \skey="([^"]+)" .
Put it in your transforms, set MV_ADD = true , and you should be good.
However, this might be helpful, too: <metaData\s+key="([^"]+)">((?:(?!<\/metaData>).)+)<\/metaData> (the tempered dot keeps the match from spilling across multiple metaData elements).
You could then set FORMAT = $1::$2 so you get fields corresponding with the key names and their proper values.
You could also do both, like this:
props.conf
[your-sourcetype]
REPORT-metadata-fields = metadata-keys-mv-field, metadata-key-value-fields
transforms.conf
[metadata-keys-mv-field]
REGEX = <metaData\s+key="(?<metadata_keys>[^"]+)"
MV_ADD = true
[metadata-key-value-fields]
REGEX = <metaData\s+key="([^"]+)">((?:(?!<\/metaData>).)+)<\/metaData>
FORMAT = $1::$2
MV_ADD = true
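As a sanity check outside Splunk, both patterns can be exercised in Python (the sample event is made up, re.findall stands in for Splunk's transform engine, and the tempered-dot variant of the second pattern is used so each match stays within a single element):

```python
import re

# Hypothetical sample event with two metaData elements.
event = '<metaData key="cpu">0.75</metaData><metaData key="mem">512</metaData>'

# First transform: collect all key names, like the multivalue field.
keys = re.findall(r'<metaData\s+key="([^"]+)"', event)
print(keys)  # ['cpu', 'mem']

# Second transform: key/value pairs, like FORMAT = $1::$2.
pairs = dict(re.findall(
    r'<metaData\s+key="([^"]+)">((?:(?!</metaData>).)+)</metaData>', event))
print(pairs)  # {'cpu': '0.75', 'mem': '512'}
```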
Hope that helps - if it does I'd be happy if you would upvote/accept this answer, so others could profit from it. 🙂
Reposting the comment of @tonniea so it doesn't get lost 🙂
In the field definition of the datamodel for your rex field you find this:
"calculationType":"Rex","expression":"(?[^<]?)<"},{"outputFields":[{"fieldName":"cpu_load","owner":"cpu2","type":"string","fieldSearch":"","required":false,"multivalue":false*,"hidden":false...etc
If you change the "multivalue" attribute to true, import the datamodel, and restart Splunk, this appears to work as intended.
As always, "it depends".
If the existing inputs.conf is located in etc/system/local/ (or worse, etc/system/default/ ), you cannot modify it via Deployment server, because DS only deploys to the etc/apps/ directory. (besides some rather ugly hacks using scripted inputs)
If you however have an inputs.conf in an app, you can simply recreate that app on the DS in etc/deployment-apps/yourapp and then distribute it to the forwarders (assuming you configured the DS IP/hostname with those forwarders).
Be aware that you need to recreate the whole app before distributing it via the DS, because any files in that app that exist only on the forwarder, but not on the DS, will be removed.
Hope that helps - if it does I'd be happy if you would upvote/accept this answer, so others could profit from it. 🙂
Good to know. Now that you have the details at hand, you could post an answer to your own question so others googling for it can profit from your experience 😉
Some example data from the Splunk tutorial:
http://docs.splunk.com/Documentation/Splunk/7.1.0/SearchTutorial/Systemrequirements#Download_the_tutorial_data_files
Some Airline example data:
https://www.transtats.bts.gov/Tables.asp?DB_ID=120
Bunch of datasets from Amazon:
https://registry.opendata.aws/
Good luck 😉