Splunk Search

Need help with field extraction

pench2k19
Explorer

Hi guys,

I would like to extract the values that are highlighted below into different fields. Can you please help me with the best way to do this, other than using the .conf files?

PS: the following text gets logged as a single event in Splunk by default.

\x00\x00jS\x00\x00\x00\x00\x00\x00**2019-03-07**\x00\x00\x00**hoganids**   \x00\x00\x00**sanitized**
\x00\x00\x00**dda_masterb**\x00\x00\x00**/apps/dat/aasconap/prod/mfs/mfs_8way/cnapp/cnapp_src/cnapp_src_hoganids/main/./dda_master_PG54.dat&**\x00\x00\x00**consumer_hoganids_sanitized.dda_master**\x00\x00\x00**amf_5_cf.dat2019-03-08 03:11:32.9940612019-03-08 03:16:42.693043=**\x00\x00\x00**warning - 35% data volume threshold reached, expected 2639757**\x00\x**00jS**\x00\x00\x00\x00\x00\x00**2019-03-07**\x00\x00\x00**hoganids**    \x00\x00\x00**sanitized**
\x00\x00\x00**dda_masterb**\x00\x00\x00**/apps/dat/aasconap/prod/mfs/mfs_8way/cnapp/cnapp_src/cnapp_src_hoganids/main/./dda_master_PG54.dat**&\x00\x00\x00**consumer_hoganids_sanitized.dda_master**\x00\x00\x00**amf_5_cf.dat2019-03-08 03:11:32.9940612019-03-08 03:16:42.693043**\x00\x00\x00**success**\x00\x00**jS**\x00\x00\x00\x00\x00\x00**2019-03-07**\x00\x00\x00**hoganids**  \x00\x00\x00**conformed**
\x00\x00\x00**dep_dmnd_acctb**\x00\x00\x00**/apps/dat/aasconap/prod/mfs/mfs_8way/cnapp/cnapp_src/cnapp_src_hoganids/main/./dda_master_PG54.dat1**\x00\x00\x00**consumer_servicingaccount_conformed.dep_dmnd_acct**\x00\x00\x00**amf_5_cf.dat2019-03-08 03:11:32.9940612019-03-08 03:16:42.693043=**\x00\x00\x00**warning - 35% data volume threshold reached, expected 2639757**

@jkat54 @vnravikumar
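For reference, a search-time-only sketch (no .conf changes) along these lines might work, assuming the \x00 sequences in the paste are real NUL bytes in the indexed event and that the field order is fixed. All names below (the index, the sourcetype, and the field names run_date, system, layer, table_name, file_path, target_table, detail, status) are placeholders invented for illustration, not values from this environment:

index=your_index sourcetype=your_sourcetype
| rex field=_raw "(?<run_date>\d{4}-\d{2}-\d{2})\x00+(?<system>[^\x00]+)\x00+(?<layer>[^\x00]+)\x00+(?<table_name>[^\x00]+)\x00+(?<file_path>[^\x00]+)\x00+(?<target_table>[^\x00]+)\x00+(?<detail>[^\x00]+)\x00+(?<status>[^\x00]+)"
| table run_date system layer table_name file_path target_table detail status

Note that rex keeps only the first match by default, and the sample looks like several records concatenated into one event, so max_match=0 (which makes the fields multivalued) or fixing the event breaking first may be needed.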


harsmarvania57
Ultra Champion

As @FrankVl mentioned, it looks like an encoding issue. If you know what type of encoding or character set is present in your file, you can set the CHARSET parameter in props.conf; have a look at the documentation: https://docs.splunk.com/Documentation/Splunk/7.2.4/Data/Configurecharactersetencoding

Splunk software attempts to apply UTF-8 encoding to your sources by default. If a source does not use UTF-8 encoding or is a non-ASCII file, Splunk software tries to convert data from the source to UTF-8 encoding unless you specify a character set to use by setting the CHARSET key in props.conf.
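As a sketch only (the sourcetype name and character set below are placeholders, not anything known from this thread), that would look like:

# props.conf on the instance that parses this data (indexer or heavy forwarder)
[your_sourcetype]
CHARSET = ISO-8859-1

# or, if the encoding is unknown, let Splunk try to detect it:
# CHARSET = AUTO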


FrankVl
Ultra Champion

Looks like you have some encoding issues; I'd suggest getting that fixed first (probably the encoding used to ingest this data does not match the actual encoding of the data).

It is also completely unclear to me which parts you actually want to extract. Can you please mark that more clearly?


richgalloway
SplunkTrust

What fields do you want extracted? There is nothing "highlighted" in your question.
Do you want to extract them at index time or search time?
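To illustrate the difference (a sketch only; the sourcetype name, the extraction names, and the regex below are placeholders): a search-time extraction can live in props.conf as an EXTRACT- setting or be done inline with rex, while an index-time extraction needs a TRANSFORMS- reference in props.conf plus a transforms.conf stanza that writes to the index.

# props.conf - search time
[your_sourcetype]
EXTRACT-run_date = (?<run_date>\d{4}-\d{2}-\d{2})

# props.conf - index time
[your_sourcetype]
TRANSFORMS-run_date = extract_run_date

# transforms.conf - index time
[extract_run_date]
REGEX = (\d{4}-\d{2}-\d{2})
FORMAT = run_date::$1
WRITE_META = true

Index-time extractions are generally discouraged unless search-time extraction cannot meet the need, and indexed fields usually also want a matching fields.conf entry.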

---
If this reply helps you, Karma would be appreciated.

pench2k19
Explorer

Please find the updated event here

\x00\x00jS\x00\x00\x00\x00\x00\x00**2019-03-07**\x00\x00\x00**hoganids** \x00\x00\x00**sanitized**
\x00\x00\x00**dda_masterb**\x00\x00\x00**/apps/dat/aasconap/prod/mfs/mfs_8way/cnapp/cnapp_src/cnapp_src_hoganids/main/./dda_master_PG54.dat&**\x00\x00\x00**consumer_hoganids_sanitized.dda_master \x00\x00\x00amf_5_cf.dat2019-03-08 03:11:32.9940612019-03-08 03:16:42.693043=**\x00\x00\x00**warning - 35% data volume threshold reached, expected 2639757**\x00\x00jS\x00\x00\x00\x00\x00\x00**2019-03-07**\x00\x00\x00**hoganids** \x00\x00\x00**sanitized**
\x00\x00\x00**dda_masterb\x00\x00\x00/apps/dat/aasconap/prod/mfs/mfs_8way/cnapp/cnapp_src/cnapp_src_hoganids/main/./dda_master_PG54.dat&**\x00\x00\x00**consumer_hoganids_sanitized.dda_master \x00\x00\x00amf_5_cf.dat2019-03-08 03:11:32.9940612019-03-08 03:16:42.693043**\x00\x00\x00**success**