Getting Data In

Exclude log rows from journald input

jni
Explorer

Hi,

I'm ingesting journald logdata, and would like to exclude all rows with "apparmor=ALLOW".

To me, the journald-filter parameter would do the trick if I could invert the selection, i.e. like "grep -v".


Is this possible, or is there another way to do this, without adding everything else to the journald-filter-parameter?

I'm using UF and Enterprise 9.4.1

TIA

Johan Nilsson


livehybrid
SplunkTrust

Hi @jni 

On a UF your options are limited, as others have shared, because the journald input filter configs are inclusive rather than exclusive. If you don't have a HF, or don't want to send this data on from the UF, then I think the only other thing you could look at is the lesser-known 'force_local_processing' flag in props.conf. Note that this makes the UF process data a bit like a HF and may increase CPU usage.

For more info check out https://docs.splunk.com/Documentation/Splunk/8.2.12/Admin/Propsconf#:~:text=force_local_processing

Example config:

## props.conf ##

[yourSourcetype]
force_local_processing = true
TRANSFORMS-drop_apparmor = drop_apparmor_allow

## transforms.conf ##
[drop_apparmor_allow]
REGEX = apparmor=ALLOW
DEST_KEY = queue
FORMAT = nullQueue

🌟 Did this answer help you? If so, please consider:

  • Adding karma to show it was useful
  • Marking it as the solution if it resolved your issue
  • Commenting if you need any clarification

Your feedback encourages the volunteers in this community to continue contributing

richgalloway
SplunkTrust

Have you tried this filter?

JournalFilter = NOT MESSAGE~"apparmor=ALLOW"

If that doesn't work then you can use props and transforms or Ingest Actions to drop the unwanted events in HF or indexer.  Props and transforms work only in the first full instance (HF or indexer) that the data passes through; Ingest Actions work in any full instance.

---
If this reply helps you, Karma would be appreciated.

PickleRick
SplunkTrust

The option for the journald input is

journalctl-filter = <string>
* These settings map directly to the arguments for the journalctl command.
  See the documentation for journalctl.
* Default: none

But in this case it might be just

journalctl-grep = <string>
* Equivalent to ‘-g’ parameter of journalctl; filter output to entries
  where the MESSAGE= field matches the specified regular expression.
  PERL-compatible regular expressions are used
* Default: none

Unfortunately, those options are "inclusive filters", not exclusion ones, so this may be tricky. The filters do not support wildcards either (which is frustrating if you want to include only entries that have a specific field set to any value, when other events simply do not have that field).

So unfortunately, the only way to go may indeed be to ingest everything with the UF and then filter it out on the heavy forwarder (which of course means you need to haul a lot of unnecessary data over the network; tough luck).

It's not specific to Splunk's input, it's rather how journalctl works.
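One possible workaround, since journalctl's -g option uses PCRE: a negative lookahead matches exactly the lines that do NOT contain a pattern, which may let you fake an exclusion through an inclusive filter. I haven't verified this against the Splunk journald input, but the regex itself can be checked with grep -P:

```shell
# PCRE negative lookahead: match only lines NOT containing "apparmor=ALLOW".
# Tested here with grep -P on sample journald-style messages:
printf 'audit: apparmor=ALLOW op="open"\naudit: apparmor=DENIED op="exec"\n' \
  | grep -P '^(?!.*apparmor=ALLOW)'
# Only the DENIED line survives.

# If your journalctl honours lookaheads, the stanza might look like
# (hypothetical input name, untested):
#   [journald://your-input]
#   journalctl-grep = ^(?!.*apparmor=ALLOW)
```

Worth testing with plain `journalctl -g '^(?!.*apparmor=ALLOW)'` on the host first before wiring it into inputs.conf.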

jni
Explorer

Hi @richgalloway !


Thanks for the suggestion, but that didn't help. I guess regex it is 😞 


Thanks and best regards

Johan


inventsekar
SplunkTrust

Hi @jni   May I know if you are using a Heavy Forwarder? The UF is a lightweight agent with the fewest features. To filter specific lines while excluding others at the Splunk Universal Forwarder (UF) level, you cannot use the typical props.conf line-filtering on the UF itself.

Instead, you must use regex in inputs.conf to filter files, or send the data to an Indexer/Heavy Forwarder and use transforms.conf to route the unwanted events to the nullQueue.

----------------------------------------------------------------------------------------------
If this post or any post addressed your question, could you pls:
Give it karma to show appreciation

PS - As of May 2026, my Karma Given is 2312 and my Karma Received is 497, lets revamp the Karma Culture!
Thanks and best regards, Sekar
--------------------------------------------------------------------------------------------

jni
Explorer

Hi @inventsekar !


I'm using a UF on the client, where I'd like to get rid of the journald log lines. 

So, if I understand you correctly : I need to send all journald log-data to the HF/indexer, and there use transforms to drop the lines? 


Thanks for your help!

Best regards

Johan


inventsekar
SplunkTrust

Hi @jni 


So, if I understand you correctly : I need to send all journald log-data to the HF/indexer, and there use transforms to drop the lines? 


Yes, exactly. Or you can use inputs.conf on the HF.

to summarize, the two ideas are:
1) On the HF or Indexer, you can use regex in inputs.conf to filter lines

OR

2) Send the logs to an Indexer/Heavy Forwarder and use props.conf and transforms.conf to route the selected events to the nullQueue (dropping them).
