Getting Data In

Apache Access Log Extract Byte Size

doprocess
Engager

I have tons of log lines coming from the Apache access log that look something like this:

11/19/19 1:39:01.000 PM 192.168.1.1 - - [19/Nov/2019:13:39:01 -0500] "GET /jquery/jquery-ui.min.js HTTP/1.1" 200 240427
11/19/19 1:38:36.000 PM 192.168.1.1 - - [19/Nov/2019:13:38:36 -0500] "GET /21ab251b74e7f7c26.js HTTP/1.1" 200 11331980

The last number on each line represents the size of the file in bytes.
I need to extract that last number and chart it, to find the total bandwidth going through my server.
How do I go about this?

1 Solution

doprocess
Engager

mayurr98, that won't work; size is not defined.

After a bit of testing, I ended up writing my own query. It captures everything after the ninth whitespace-delimited field (the tenth field is the size in bytes), and I used it like this to chart the bandwidth used by .js or .css files on a one-hour timechart:

GET 200 (.js OR .css)
| rex "^(\S*\s){9}(?<byte_size>.*)$"
| bucket _time span=1h
| timechart sum(byte_size) as "Bandwidth (GB)"
| eval "Bandwidth (GB)"=round('Bandwidth (GB)'/1024/1024/1024/6,2)

This works like a charm, I hope someone finds this useful.
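If you want to sanity-check the rex pattern outside Splunk, here is a quick Python sketch using the sample lines from the question (without the Splunk display timestamp, since rex runs against the raw event text):

```python
import re

# Same pattern as the rex command: skip nine whitespace-delimited
# fields, then capture the rest of the line (the byte count).
PATTERN = re.compile(r"^(\S*\s){9}(?P<byte_size>.*)$")

lines = [
    '192.168.1.1 - - [19/Nov/2019:13:39:01 -0500] "GET /jquery/jquery-ui.min.js HTTP/1.1" 200 240427',
    '192.168.1.1 - - [19/Nov/2019:13:38:36 -0500] "GET /21ab251b74e7f7c26.js HTTP/1.1" 200 11331980',
]

for line in lines:
    m = PATTERN.match(line)
    print(m.group("byte_size"))  # → 240427, then 11331980
```

Each iteration of `(\S*\s){9}` consumes one field plus its trailing space, so after nine repetitions the capture group starts at the tenth field.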




mayurr98
Super Champion

Hi, to extract it, try this:

| rex "(?<size>\d+$)"

To calculate the total size per server, try this:

| rex "(?<size>\d+$)"
| stats sum(size) as size_in_bytes by server
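The `\d+$` pattern can also be checked outside Splunk with a short Python sketch, using one of the sample lines from the question:

```python
import re

line = '192.168.1.1 - - [19/Nov/2019:13:38:36 -0500] "GET /21ab251b74e7f7c26.js HTTP/1.1" 200 11331980'

# \d+$ anchors on the trailing run of digits, i.e. the byte count.
m = re.search(r"(?P<size>\d+$)", line)
print(m.group("size"))  # → 11331980
```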