Getting Data In

Delete a line that is all hex 0, i.e. \x00

65pony
Explorer

We have a very strange file where the first line has hundreds of \x00 values.
e.g. the following, times 50:

\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00

I want to get rid of this line but do not seem to be having any luck with my transforms.
Here are the regexes I have tried, with no luck... any suggestions?

REGEX = ^0x00+ , \x00 , ^\x00  
DEST_KEY = queue
FORMAT = nullQueue
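
For comparison, a minimal sketch of a working null-routing pair (the stanza name drop_null_lines and the sourcetype name my_sourcetype are hypothetical; Splunk's REGEX values are PCRE, where \x00 denotes the NUL character):

```ini
# transforms.conf -- "drop_null_lines" is a hypothetical stanza name
[drop_null_lines]
REGEX    = ^\x00+
DEST_KEY = queue
FORMAT   = nullQueue

# props.conf -- the transform only runs if a sourcetype references it
[my_sourcetype]
TRANSFORMS-nulls = drop_null_lines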

rturk
Builder

Hi 65pony,

There are quite a few posted questions on the site, and a common suggestion is to confirm the character set (CHARSET) of the file being indexed. I had a similar issue with some logs, where a single 27MB archive would chew through an entire 50GB license due to the \x00 values... not ideal.

I got around this by having the following props.conf:

[nice_sqlTrace]
CHARSET          = UTF-16LE
SHOULD_LINEMERGE = false

So see if you can confirm the CHARSET of the file to be indexed and set that accordingly.

Let me know how you get on 🙂


lukejadamec
Super Champion

Log on to the system that generates this garbage, look at the file, and post the redacted contents of the problem event.
If the actual log contains this, then talk to the developers. Otherwise, stripping it out at index time is like hacking the logs.


jonuwz
Influencer

Have you tried \\x00?
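
For what it's worth, Splunk's REGEX values are PCRE, where the hex escape \x00 already denotes the NUL byte. You can sanity-check the pattern outside Splunk with any PCRE-compatible engine; a quick sketch (Python's re, standing in for PCRE here):

```python
import re

# A line like the problem event: nothing but NUL characters.
line = "\x00" * 50

# PCRE-style hex escape: \x00 is the NUL character, + means one or more.
pattern = re.compile(r"^\x00+$")

matched = bool(pattern.match(line))
print(matched)  # expect: True
```

If the pattern matches here but the transform still does nothing, the mismatch is likely happening before the regex runs, e.g. the file being decoded with the wrong CHARSET.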
