Getting Data In

Why does event breaking based on the timestamp work when uploading a file, but not with a sourcetype in props.conf?

0xlc
Path Finder

Hi guys,

I am trying to index a ProxySQL log file which looks like:

ProxySQL LOG QUERY: thread_id="25" username="blabla" schemaname=information_schema" client="10.206.119.24:62462" HID=1 
server="backendserver.example:3306" starttime="2019-01-24 14:13:42.436497" endtime="2019-01-24 14:13:42.446705" 
duration=10208us digest="0x3C740A905F66E34A"
SELECT  * from example
ProxySQL LOG QUERY: thread_id="25" username="blabla" schemaname=information_schema" client="10.206.119.24:62462" HID=1 
server="backendserver.example:3306" starttime="2019-01-24 14:13:42.436497" endtime="2019-01-24 14:13:42.446705" 
duration=10208us digest="0x3C740A905F66E34A"
SELECT  @@port

When I tried to add this log using Add Data, I selected event breaker "auto" and time prefix "starttime", and everything was perfect.

Then I tried it with props.conf:

[proxysql]
TIME_PREFIX = starttime
SHOULD_LINEMERGE = true
EVENT_BREAKER_ENABLED = true

In inputs.conf, I set sourcetype = proxysql.

And it does not work either. Is it possible that even if I set it up properly, the new settings won't apply to logs that are already indexed? I tried various combinations but nothing changes. (The log file is static for now; it's not getting any new data, so it has already been indexed.)
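For reference, the upload wizard's choices roughly correspond to a stanza like this (a sketch, not the wizard's exact output; the LINE_BREAKER regex and TIME_FORMAT are assumptions based on the log sample above):

[proxysql]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)ProxySQL LOG QUERY:
TIME_PREFIX = starttime="
TIME_FORMAT = %Y-%m-%d %H:%M:%S.%6N
MAX_TIMESTAMP_LOOKAHEAD = 30

With LINE_BREAKER, the text matched by the first capture group is discarded and a new event starts at "ProxySQL LOG QUERY:", so the multi-line query stays attached to its header line. Note that this stanza must be on the indexers (or heavy forwarders) that parse the data, not only on the search head.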

Thanks!


0xlc
Path Finder

I tried with a new log file and it works; I can extract all fields except the queries, which are always the last lines.

Basically, each query starts on a new line and is split across multiple lines, like:

ProxySQL LOG QUERY: thread_id="43" username="redacted" schemaname=information_schema" client=redacted:51827" HID=1 server="redacted:3306" starttime="2019-01-24 10:37:20.959324" endtime="2019-01-24 10:37:21. 47135" duration=87811us digest="0x8D9F0318EE412645"
select date_format(t.redacted,'%d-%m-%Y') date, t.redacted,t.redacted, c.redacted, c.description, CASE when t.redacted = 'redacted' 
then redacted else ppcustom_field end Ref, t.redacted, t.net_amount, t.redacted
from redacted.redacted c, redacted.redacted t
where c.redacted = t.redacted
-- and t.redacted is not null
and t.redacted >= '2019-01-23 00:00:01' and t.redacted < '2019-01-24 00:00:01'
order by 2,1,4,6,5

I don't know how to write a regex to extract that. I can't anchor on ^select because the query could also be an update. What I am sure of is that the query always comes after the digest field.
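Since the query reliably follows the digest field, one option is a search-time extraction anchored on it. A sketch (the field name "query" is my own choice):

[proxysql]
EXTRACT-query = digest="[^"]+"\s+(?<query>[\s\S]+)

The [\s\S] class matches across newlines, so the whole multi-line query is captured regardless of whether it starts with select, update, or anything else. The same pattern can be tested interactively with the rex search command before committing it to props.conf.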


dkeck
Influencer

You cannot change event breaking for logs that have already been indexed; parsing settings only apply to data ingested after the change.
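If you need to re-test breaking against the same static file, one option (a sketch; the index name and file path are placeholders, and clean eventdata is destructive, so use a throwaway test index only) is:

./splunk clean eventdata -index proxysql_test -f
./splunk add oneshot /var/log/proxysql/queries.log -sourcetype proxysql -index proxysql_test

This wipes the test index and re-reads the file once, so each props.conf change can be verified against fresh events.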


0xlc
Path Finder

I was expecting that.

In the end, I saved the props configuration generated by the web Add Data upload (overwriting the proxysql sourcetype), then copied and pasted it from the search head into props.conf on the master node.

Tomorrow I'll get new logs to check whether it works.

Thanks!
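For an indexer cluster, the usual way to distribute such a stanza (a sketch; the _cluster app location is the default layout, yours may differ) is to place it under master-apps on the cluster master and push the bundle to the peers:

# edit on the cluster master:
#   $SPLUNK_HOME/etc/master-apps/_cluster/local/props.conf
# then push the configuration bundle to the indexers:
./splunk apply cluster-bundle

The peers restart or reload as needed and pick up the new parsing settings for data arriving afterwards.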


dkeck
Influencer

You can check whether your indexers picked up the config with btool:

./splunk cmd btool props list --debug

You could pipe the output to grep for your app name to cut down btool's output.
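For example (the stanza name comes from this thread; "my_app" is a placeholder for whatever app holds your props.conf):

./splunk cmd btool props list proxysql --debug
./splunk cmd btool props list --debug | grep -i my_app

The --debug flag prefixes each line with the file it came from, which shows immediately whether your stanza is being read and from which app.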
