field extractor help

a212830
Champion

Hi,

I need to extract some fields via the field extractor. I got most of them, but it's ignoring the ones that have decimal places. I want to grab the 5th field. The extraction is as follows:

(?i)\|.*?\|(?P<FIELDNAME>\d+)(?=\|)

Here's some sample data:

1375839600000|330128|NormalizedCPUInfo|Utilization|2|CPU|deviceA|HP H3C CPU

1375839600000|330571|NormalizedCPUInfo|Utilization|10|CPU|DEVICEB|HP H3C CPU

1375839600000|355140|NormalizedMemoryInfo|Utilization|24.331592003281088|Memory|ANOTHERDEVICE|Enhanced-MemoryPool: Processor 6033.1

1375839600000|355140|NormalizedMemoryInfo|Free|166252960|Memory|YETANOTHERDEVICE|Enhanced-MemoryPool: Processor 6033.1

1375839600000|355140|NormalizedMemoryInfo|LargestMemoryFree|165506776|Memory|LABDEVICE|Enhanced-MemoryPool: Processor 6033.1

Can anyone help?


kristian_kolb
Ultra Champion

I would recommend using the props/transforms approach with DELIMS and FIELDS instead. It's a little easier to get right, and you get all the fields extracted in one go.

props.conf

[your_sourcetype]
REPORT-blah = pipe_separated

transforms.conf

[pipe_separated]
DELIMS = "|"
FIELDS = field1, field2, field3, field4, field5, field6, field7, field8
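
With that in place, every pipe-separated value comes out as its own field at search time, decimals included. A quick sanity check could look something like this (the index name and the generic field names above are just placeholders for whatever you actually use):

index=your_index sourcetype=your_sourcetype field4=Utilization
| stats avg(field5) AS avg_utilization BY field7

Since DELIMS splits on the raw delimiter rather than matching a pattern, values like 24.331592003281088 come through unchanged.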

/Kristian

grijhwani
Motivator

Good solution to the broader unasked question.


grijhwani
Motivator

The answer is very simple: "\d+" only allows for one or more digits, and a decimal point is not a digit.
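
So the capture group needs to allow an optional fractional part, for example (FIELDNAME is just a stand-in for whatever name your extraction actually uses):

(?i)\|.*?\|(?P<FIELDNAME>\d+(?:\.\d+)?)(?=\|)

The (?:\.\d+)? part matches a literal dot followed by digits, zero or one time, so plain integer values still match exactly as before.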

a212830
Champion

Thanks. Caught it! Duh.
