Activity Feed
- Posted Re: update transforms.conf and props.conf from an app on Splunk Enterprise. 2 weeks ago
- Karma Re: update transforms.conf and props.conf from an app for isoutamo. 2 weeks ago
- Karma Re: add icon to app for kamlesh_vaghela. 2 weeks ago
- Karma Re: update transforms.conf and props.conf from an app for isoutamo. 2 weeks ago
- Posted Re: update transforms.conf and props.conf from an app on Splunk Enterprise. 3 weeks ago
- Posted Re: update transforms.conf and props.conf from an app on Splunk Enterprise. 3 weeks ago
- Karma Re: update transforms.conf and props.conf from an app for isoutamo. 3 weeks ago
- Posted Re: update transforms.conf and props.conf from an app on Splunk Enterprise. 3 weeks ago
- Karma Re: update transforms.conf and props.conf from an app for livehybrid. 3 weeks ago
- Posted update transforms.conf and props.conf from an app on Splunk Enterprise. 3 weeks ago
- Posted Re: Force inclusion of space character as a first character in FORMAT string in transforms.conf on Getting Data In. 01-20-2025 12:08 AM
- Karma Re: Force inclusion of space character as a first character in FORMAT string in transforms.conf for tscroggins. 01-19-2025 10:31 PM
- Got Karma for Re: Force inclusion of space character as a first character in FORMAT string in transforms.conf. 01-19-2025 12:55 PM
- Posted Re: conditional whitespace in transform on Getting Data In. 01-19-2025 06:51 AM
- Posted Re: Force inclusion of space character as a first character in FORMAT string in transforms.conf on Getting Data In. 01-19-2025 06:45 AM
- Karma Re: Force inclusion of space character as a first character in FORMAT string in transforms.conf for tscroggins. 01-19-2025 06:42 AM
- Posted Re: Force inclusion of space character as a first character in FORMAT string in transforms.conf on Getting Data In. 01-19-2025 01:36 AM
- Posted Force inclusion of space character as a first character in FORMAT string in transforms.conf on Getting Data In. 01-18-2025 08:31 AM
- Posted Re: conditional whitespace in transform on Getting Data In. 01-17-2025 01:08 PM
- Posted Re: conditional whitespace in transform on Getting Data In. 01-17-2025 02:55 AM
2 weeks ago
We took a few steps and looked at how the config files work. It seemed as if the contents of the different config types were virtually merged (each type of config with its own kind). We therefore reasoned that we could keep the settings we were setting up via the GUI for the forwarder, and use our outputs.conf from the app to add/override the settings we needed, and it turns out this approach works! So now we can set up the forwarding via the Web UI and also have those settings augmented with our own extra settings. This seems to solve our initial problem.
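To illustrate the layering this relies on, here is a hedged sketch. The app name "myapp" and the exact paths are illustrative (the Web UI may write to a different location depending on version); the point is that Splunk merges every copy of outputs.conf by precedence, so an app can contribute keys the UI did not set.

```ini
# etc/system/local/outputs.conf -- written by the Web UI
[tcpout]
defaultGroup = default-autolb-group

[tcpout:default-autolb-group]
server = localhost:9000

# etc/apps/myapp/local/outputs.conf -- merged on top by layering;
# it can add keys the UI left unset (it cannot override system/local,
# which has the highest precedence outside of search contexts)
[tcpout:default-autolb-group]
sendCookedData = false
```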
3 weeks ago
I tried putting my props.conf and transforms.conf into $SPLUNK_HOME/etc/apps/yourAppName/local/, but the settings don't seem to take effect for some reason. I created a tcpout destination from the web UI, but it nevertheless tries to send the data over S2S, disregarding what I've set in transforms.conf. Though I have to admit, I need to have something like this in outputs.conf:

# Because the audit trail is protected and we can't transform it,
# we can't use the default group; we must use TCP routing
[tcpout]
defaultGroup = NoForwarding

[tcpout:nexthop]
server = localhost:9000
sendCookedData = false

But if I set up the destination from the Forwarding and receiving page, I get something like this:

[tcpout]
defaultGroup = default-autolb-group

[tcpout:default-autolb-group]
server = localhost:9000

[tcpout-server://localhost:9000]
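The "tcp_routing" remark in the comment above can be realized with a routing transform. A hedged sketch (the stanza name is hypothetical; the group name reuses nexthop from above): a transform whose DEST_KEY is _TCP_ROUTING assigns matching events to a named tcpout group, so ordinary events can be steered to nexthop while defaultGroup stays at NoForwarding and the protected audit trail is left alone.

```ini
# transforms.conf -- hypothetical routing stanza; events matching
# REGEX are sent to the tcpout group named in FORMAT
[route_to_nexthop]
REGEX = .
DEST_KEY = _TCP_ROUTING
FORMAT = nexthop

# props.conf -- apply the routing transform to all sourcetypes
[default]
TRANSFORMS-routing = route_to_nexthop
```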
3 weeks ago
I remember being able to install apps from a zip file via the web GUI on physical Splunk installations. On the other hand, I got an idea. It may be a stupid question, but is it possible to configure a tcpout output in the Splunk web UI? If yes, then there is no need for a separate second app. Then I would only need to add the transforms and props configs and have the users configure the tcpout on their own, and that's it. Just to give you an idea, I want to package something similar to the SC4S heavy forwarder configs described here: https://splunk.github.io/splunk-connect-for-syslog/main/sources/vendor/Splunk/heavyforwarder/
3 weeks ago
@livehybrid and @isoutamo Thanks for both of your answers! I more or less know what I need to put in those files, so I have that part figured out already. Yes, as far as my understanding goes, the app is supposed to go on a heavy forwarder node. We have no plans for using a deployment server. For the initial POC phase, I believe that adding the app as a simple zip file would suffice. As for outputs.conf, is it possible to somehow dynamically generate its content? I mean, to ask the user for a hostname or IP address, and then use that value for the server setting. I will try adding my configs to the app and will report back in a few days. Looking now at the page https://dev.splunk.com/enterprise/docs/developapps/extensionpoints it does mention props.conf and transforms.conf, but there seems to be no mention of outputs.conf. Is it possible to have an outputs.conf in the app and force Splunk to use it even though it is not present in that list?
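For what it's worth, outputs.conf is an ordinary .conf file that an app can ship like any other; the extension-points list does not appear to be exhaustive. A hedged sketch of one layout (the app name and server values are illustrative): ship a default in the app and let the user point it at their own destination in local/, which takes precedence over default/.

```ini
# etc/apps/myapp/default/outputs.conf -- shipped with the app
[tcpout:nexthop]
server = localhost:9000
sendCookedData = false

# etc/apps/myapp/local/outputs.conf -- created by the user to point
# at their own destination; local/ overrides default/ key by key
[tcpout:nexthop]
server = syslog.example.com:9000
```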
3 weeks ago
Dear Members, I have a use case where I need to update or insert configuration in transforms.conf, props.conf, and outputs.conf. I was told that it is possible to do this by creating an app. That would make it easier for users to make the necessary changes, instead of the error-prone manual procedure. Nevertheless, I haven't come across any documentation that illustrates and explains how to do it. Does someone have experience with that? Or can someone point me to the relevant documentation? Thanks in advance!
Labels: configuration
01-20-2025
12:08 AM
Thanks for the explanation, it was educational. I have corrected the plus operators where appropriate. On the other hand, I have now triple-checked, and indeed, multiple leading whitespaces are ignored in the FORMAT string. But yes, it would seem that Splunk, or whoever wrote the SC4S config, assumed they would be honored.
01-19-2025
06:51 AM
In the end, after struggling with this for several days, I thought to myself: this is leading me nowhere. I wanted the fields to follow each other like this: (TIME)(SUBSECOND) (HOST) I had the idea to concentrate on adding the whitespace before HOST, and not after TIME or SUBSECOND. This approach also had its problems, because spaces at the start of the FORMAT string seem to be ignored, but I managed to get around that here: https://community.splunk.com/t5/Getting-Data-In/Force-inclusion-of-space-character-as-a-first-character-in/m-p/709157 This way I could drop the question of adding whitespaces conditionally. Though I could not solve this particular problem, my overall problem is now solved.
01-19-2025
06:45 AM
1 Karma
@tscroggins Thanks for the idea. It worked, though I added my own set of modifications to it. As a final touch, here is the relevant part of my config, to contribute it back to the community:

[md_host]
INGEST_EVAL = _raw=" _h=".host." "._raw
[md_subsecond]
SOURCE_KEY = _meta
REGEX = _subsecond=(\.\d+)
FORMAT = $1$0
DEST_KEY = _raw
[md_time]
SOURCE_KEY = _time
REGEX = (.*)
FORMAT = _ts=$1$0
DEST_KEY = _raw
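A hedged sketch of how these three stanzas might be wired up in props.conf (the sourcetype name is hypothetical). Since each stanza prepends to _raw, they are listed in reverse of the order in which the fields should finally appear:

```ini
# props.conf -- hypothetical wiring; md_time runs last and therefore
# ends up leftmost, yielding _ts=<time><subsecond> ... _h=<host> ...
[your_sourcetype]
TRANSFORMS-md = md_host, md_subsecond, md_time
```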
01-19-2025
01:36 AM
Good idea; I tried it, but unfortunately it doesn't seem to work. I have this configured:

[md_host]
SOURCE_KEY = MetaData:Host
REGEX = ^host::(.*)$
FORMAT = \ _h=$1 $0
DEST_KEY = _raw
[md_subsecond_default]
SOURCE_KEY = _meta
REGEX = _subsecond=(\.\d+)
FORMAT = $1$0
DEST_KEY = _raw
[md_time_default]
SOURCE_KEY = _time
REGEX = (.*)
FORMAT = _ts=$1$0
DEST_KEY = _raw

And I get this:

0x0040: e073 339e e073 339e 3232 3738 205f 7473 .s3..s3.2278._ts
0x0050: 3d31 3733 3732 3739 3038 315c 205f 683d =1737279081\._h=
0x0060: 7370 6c75 6e6b 2d68 6620 5f69 6478 3d5f splunk-hf._idx=_

But I agree, this would have been the most elegant solution.
01-18-2025
08:31 AM
Hello everyone! I am experimenting with the SC4S transforms posted here: https://splunk.github.io/splunk-connect-for-syslog/main/sources/vendor/Splunk/heavyforwarder/ My problem is that I am trying to reformat fields, and in one particular place I need to ensure that a space precedes the _h= part in the transform stanza below.

[md_host]
SOURCE_KEY = MetaData:Host
REGEX = ^host::(.*)$
FORMAT = _h=$1 $0
DEST_KEY = _raw

However, if I add multiple whitespace characters in the FORMAT string, right after the equals sign in the above example, they are ignored. Should I put the whole thing between quotes? Wouldn't the quotes then be included in the _raw string? What would be the right solution for this?
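One workaround that surfaces later in this thread: build the prefix with INGEST_EVAL rather than FORMAT, since leading spaces inside a quoted eval string are preserved. A minimal sketch, assuming the host metadata is available as the host field at ingest time:

```ini
# transforms.conf -- quoted strings in INGEST_EVAL keep their
# leading spaces, unlike the FORMAT setting
[md_host]
INGEST_EVAL = _raw=" _h=".host." "._raw
```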
Labels: transforms.conf
01-17-2025
01:08 PM
I have been experimenting further, and found the following. This is my latest test config:

[md_time]
SOURCE_KEY = _time
REGEX = (.*)
FORMAT = _ts=$1
DEST_KEY = time_temp
[md_subsecond]
SOURCE_KEY = _meta
REGEX = _subsecond=(\.\d+)
FORMAT = $1
DEST_KEY = subsecond_temp
[md_fix_subsecond]
INGEST_EVAL = _raw=if(isnull(time_temp), "aaa" . _raw, "bbb" . _raw)
#INGEST_EVAL = _raw=if(isnull(subsecond_temp), time_temp . " " . _raw, time_temp . subsecond_temp . " " . _raw)

Both md_time and md_subsecond are in the list, before md_fix_subsecond. If in md_fix_subsecond I check for the null-ness of either time_temp or subsecond_temp, both are reported as null, so for some reason they are not available in the INGEST_EVAL. And as they are both null, referencing them resulted in an error, and no log was output. How could we resolve this?
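One likely cause, offered as an assumption: DEST_KEY only accepts a fixed set of pipeline keys (such as _raw, _meta, queue, MetaData:*, _TCP_ROUTING), so writing to invented names like time_temp may silently go nowhere, leaving those names null in INGEST_EVAL. A hedged sketch that stashes the temporary value as an indexed field in _meta instead, where later index-time processing can see it:

```ini
# transforms.conf -- hypothetical rework; append time_temp::<value>
# to _meta (a valid DEST_KEY) instead of a made-up destination key
[md_time]
SOURCE_KEY = _time
REGEX = (.*)
FORMAT = $0 time_temp::$1
DEST_KEY = _meta
```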
01-17-2025
02:55 AM
I even set up the Windows box to emit metadata matching the regex in the transform, because none of my logs seemed to have that subsecond data. Now my Windows test machine sends, among its metadata, the string time_subsecond=.123456 And interestingly enough, the subsecond transform doesn't get triggered. My latest version is:

[metadata_subsecond]
SOURCE_KEY = _meta
REGEX = _subsecond::(\.[0-9]+)
FORMAT = $1$0
DEST_KEY = _raw

But nothing seems to happen.
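A possible explanation, stated as an assumption: the regex looks for a key literally named _subsecond with a :: separator, while the metadata configured on the test machine was time_subsecond=.123456, so the key names never match. A hedged sketch that matches the key name as actually sent (the separator may appear as :: inside _meta, so both forms are allowed):

```ini
# transforms.conf -- hypothetical; match the key name actually
# present in _meta, accepting either "::" or "=" as separator
[metadata_subsecond]
SOURCE_KEY = _meta
REGEX = time_subsecond(?:::|=)(\.\d+)
FORMAT = $1$0
DEST_KEY = _raw
```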
01-16-2025
02:27 AM
[md_time]
SOURCE_KEY = _time
REGEX = (.*)
FORMAT = _ts=$1
DEST_KEY = time_temp
[md_subsecond]
SOURCE_KEY = _meta
REGEX = _subsecond::(\.\d+)
FORMAT = $1
DEST_KEY = subsecond_temp
[md_fix_subsecond]
INGEST_EVAL = _raw=if(isnull(subsecond_temp), time_temp + " " + _raw, time_temp + subsecond_temp + " " + _raw)
[md_time_default]
SOURCE_KEY = _time
REGEX = (.*)
FORMAT = _ts=$1 $0
DEST_KEY = _raw

The problem seems to be somewhere in md_time, md_subsecond, or md_fix_subsecond, because if I use md_time_default it works (though without subseconds), and if I enable these three instead of md_time_default, I get no output: the packets emitted by Splunk seem to be empty, without a payload.
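A hedged guess at the empty-payload symptom: in Splunk's eval language "." is the string-concatenation operator, while "+" on operands of mixed or null types can evaluate to null, and a null result assigned to _raw would produce exactly the empty packets described. A sketch of md_fix_subsecond rewritten with ".":

```ini
# transforms.conf -- hypothetical fix; "." concatenates strings,
# whereas "+" with a null or non-string operand can null out _raw
[md_fix_subsecond]
INGEST_EVAL = _raw=if(isnull(subsecond_temp), time_temp." "._raw, time_temp.subsecond_temp." "._raw)
```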
01-15-2025
09:40 AM
I tried what you suggested, but it did not seem to help. It seemed as if the fix_subsecond stanza wasn't executed at all; the _h KV pair followed _ts's value without a whitespace. After experimenting a bit more, I now have this, but it doesn't work either:

[md_time]
SOURCE_KEY = _time
REGEX = (.*)
FORMAT = _ts=$1 $0
DEST_KEY = time_temp
[md_subsecond]
SOURCE_KEY = _meta
REGEX = _subsecond::(\.\d+)
FORMAT = $1
DEST_KEY = subsecond_temp
[md_fix_subsecond]
INGEST_EVAL = _raw=if(isnull(subsecond_temp),time_temp+" "+_raw,time_temp+subsecond_temp+" "+_raw)

Plus props.conf:

[default]
ADD_EXTRA_TIME_FIELDS = none
ANNOTATE_PUNCT = false
SHOULD_LINEMERGE = false
TRANSFORMS-zza-syslog = syslog_canforward, reformat_metadata, md_add_separator, md_source, md_sourcetype, md_index, md_host, md_subsecond, md_time, md_fix_subsecond, discard_empty_msg
# The following applies for TCP destinations where the IETF frame is required
TRANSFORMS-zzz-syslog = syslog_octet_count, octet_count_prepend
# Comment out the above and uncomment the following for udp
#TRANSFORMS-zzz-syslog-udp = syslog_octet_count, octet_count_prepend, discard_empty_msg
[audittrail]
# We can't transform this source type; it's protected
TRANSFORMS-zza-syslog =
TRANSFORMS-zzz-syslog =

However, this now breaks logging and I'm getting no logs forwarded to syslog-ng. The connection is up, but no meaningful data arrives, just "empty" packets. What may be the problem? Did I break the sequence of the stanzas? (I don't seem to understand the ordering in the first place, as the stanzas appear to be in backward order compared to how the KV pairs follow each other in the actual log message.)
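On the ordering question, as far as I understand it (hedged): within one TRANSFORMS-<class> list the stanzas run left to right, and the classes themselves run in lexicographic order of the class name (zza before zzz). Because each of these stanzas prepends to _raw, the last stanza listed produces the leftmost field in the final event, which is why the list reads "backward" relative to the KV order in the message:

```ini
# props.conf -- stanzas in one class run left to right; each
# prepends to _raw, so md_time (listed last) yields the leftmost
# field, _ts=, followed by the subsecond and then _h=
TRANSFORMS-zza-syslog = md_host, md_subsecond, md_time
```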
01-15-2025
04:23 AM
Yes, I noticed the excess escaping too, but if it's incorrect, then the suggested SC4S config is wrong too. Either way, I will try both variants and get back to you. Thank you so much for your help!
01-15-2025
12:42 AM
Yes, it is indeed. I thought of that, but I assume the creators of SC4S wanted the timestamp to have the fractional seconds appended to it if there is a metadata variable holding them. In that case, the decimal point and the fractional seconds need to follow the timestamp without any whitespace. That is why the whitespace is missing where you pointed it out. If there is no variable holding the fractional seconds, however, as in my case, no trailing space is added to the timestamp, and the host key-value pair directly follows it without a whitespace. Any idea how I could add a whitespace conditionally?
01-14-2025
09:44 PM
Thanks for your help with this. In the meantime I've run into another problem. Could you please help me? This is the topic: https://community.splunk.com/t5/Getting-Data-In/conditional-whitespace-in-transform/m-p/708831
01-14-2025
09:42 PM
Hello everyone! I am experimenting with the SC4S transforms posted here: https://splunk.github.io/splunk-connect-for-syslog/main/sources/vendor/Splunk/heavyforwarder/ My goal is to send the logs to a syslog-ng instance running with a custom config. My current problem is that the SC4S config contains a part that checks for subseconds and appends the value to the timestamp, if found.

[metadata_source]
SOURCE_KEY = MetaData:Source
REGEX = ^source::(.*)$
FORMAT = _s=$1 $0
DEST_KEY = _raw
[metadata_sourcetype]
SOURCE_KEY = MetaData:Sourcetype
REGEX = ^sourcetype::(.*)$
FORMAT = _st=$1 $0
DEST_KEY = _raw
[metadata_index]
SOURCE_KEY = _MetaData:Index
REGEX = (.*)
FORMAT = _idx=$1 $0
DEST_KEY = _raw
[metadata_host]
SOURCE_KEY = MetaData:Host
REGEX = ^host::(.*)$
FORMAT = _h=$1 $0
DEST_KEY = _raw
[metadata_time]
SOURCE_KEY = _time
REGEX = (.*)
FORMAT = _ts=$1$0
DEST_KEY = _raw
[metadata_subsecond]
SOURCE_KEY = _meta
REGEX = \_subsecond\:\:(\.\d+)
FORMAT = $1 $0
DEST_KEY = _raw

In my case, however, when it's not found, the timestamp field does not get a whitespace appended, and is thus practically concatenated with the following field, which is not what I want. How can I set up the config so that there is always a whitespace before the next field (the host/_h field)? I tried adding an extra whitespace in front of the _h in the FORMAT part of the metadata_host stanza, but that seems to be ignored. This is what I see:

05:58:07.270973 lo In ifindex 1 00:00:00:00:00:00 ethertype IPv4 (0x0800), length 16712: (tos 0x0, ttl 64, id 49071, offset 0, flags [DF], proto TCP (6), length 16692)
127.0.0.1.49916 > 127.0.0.1.cslistener: Flags [.], cksum 0x3f29 (incorrect -> 0x5743), seq 1:16641, ack 1, win 260, options [nop,nop,TS val 804630966 ecr 804630966], length 16640
0x0000: 0800 0000 0000 0001 0304 0006 0000 0000 ................
0x0010: 0000 0000 4500 4134 bfaf 4000 4006 3c12 ....E.A4..@.@.<.
0x0020: 7f00 0001 7f00 0001 c2fc 2328 021a 7392 ..........#(..s.
0x0030: 486d 209f 8010 0104 3f29 0000 0101 080a Hm......?)......
0x0040: 2ff5 b1b6 2ff5 b1b6 5f74 733d 3137 3336 /.../..._ts=1736
0x0050: 3931 3730 3739 5f68 3d73 706c 756e 6b2d 917079_h=splunk-
0x0060: 6866 205f 6964 783d 5f6d 6574 7269 6373 hf._idx=_metrics
0x0070: 205f 7374 3d73 706c 756e 6b5f 696e 7472 ._st=splunk_intr

This is the interesting part:

0x0040: 2ff5 b1b6 2ff5 b1b6 5f74 733d 3137 3336 /.../..._ts=1736
0x0050: 3931 3730 3739 5f68 3d73 706c 756e 6b2d 917079_h=splunk-
0x0060: 6866 205f 6964 783d 5f6d 6574 7269 6373 hf._idx=_metrics

The _h comes right after the end of the _ts field, without any clear separation.
Labels: transforms.conf
01-14-2025
01:24 AM
Thanks! That is understandable. Based on your answers so far, I will think through what would work best, and will get back to you. But either way, I think I got all the answers I needed.
01-14-2025
12:40 AM
Thanks for clarifying. You helped a lot! That means there are two options for me:
- do this conversion on the syslog-ng side, which won't hurt the Splunk side of things
- forward the logs to yet another Splunk instance that only does this conversion, thereby isolating the "production" Splunk instance from these transforms
01-14-2025
12:12 AM
Okay, I reverted to using INGEST_EVAL; that works as well. On the other hand, I have an additional question: if a given Splunk node is already forwarding logs to another node over S2S or S2S over HEC, and I want to add this configuration to send the logs to yet another destination (a node running syslog-ng), will this configuration break the log format of the pre-existing destinations? Or is it safe to use from this perspective?
01-13-2025
11:01 AM
I can confirm that this type of setup does not work for the Windows logs:

[sanitize_metadata]
EVAL-EEEE =replace(_meta,"::","=")
[metadata_meta]
SOURCE_KEY = EEEE
REGEX = (?ims)(.*)
FORMAT = $1__-__$0
DEST_KEY = _raw

The problem is that with this, the Windows logs only contain the eventlog message part, as if they had no metadata attached.
01-13-2025
10:41 AM
I have played around a bit more... This is what seems to be working for me:

[sanitize_metadata]
EVAL-_meta=replace(_meta,"::","=")
[metadata_meta]
SOURCE_KEY = _meta
REGEX = (?ims)(.*)
FORMAT = $1__-__$0
DEST_KEY = _raw

Note: __-__ is just a placeholder for a separator. I found an article aiming at a marginally similar goal as mine: https://zchandikaz.medium.com/alter-splunk-data-at-indexing-time-a10c09713f51 There, the author uses EVAL instead of INGEST_EVAL. Is there any significant difference? Also, I changed your example because it behaved differently when I did not use _meta as the target variable in the INGEST_EVAL. I noticed that with your version, the logs originating from the Windows machine with the UF on it were missing the metadata assigned there. When I use my version, all the metadata set on the UF (static key-value pairs) is present in the log. Any idea why that might be? Either way, thanks so much for your effort to help me! I really appreciate it!
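On the EVAL vs INGEST_EVAL question, my understanding (hedged, and the sourcetype name below is hypothetical): EVAL-<field> belongs in a props.conf stanza and runs at search time, deriving a field only when the data is searched within Splunk, while INGEST_EVAL belongs in a transforms.conf stanza and rewrites the data at index time, before it is indexed or forwarded — which is what matters when sending to syslog-ng. A minimal sketch of the contrast:

```ini
# props.conf -- search time: EVAL- derives a field at search time
# only; the forwarded raw data is untouched
[your_sourcetype]
EVAL-loud_host = upper(host)

# transforms.conf -- index time: INGEST_EVAL rewrites the event
# before it moves on through the pipeline
[sanitize_metadata]
INGEST_EVAL = _meta=replace(_meta,"::","=")
```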
01-13-2025
08:26 AM
No, it's a custom-configured syslog-ng instance that I set up. Looking at the arriving logs, I saw that the logs that previously had the metadata part included now have nothing instead, and the separators (~~~EM~~~ and ~~~SM~~~) are missing too.
01-13-2025
08:01 AM
I am forwarding the logs from the Splunk HF to a syslog-ng instance, which I configured myself, so it doesn't matter here.