Getting Data In

How do I know and verify whether the inputs.conf syntax I write in deployment apps works or not?

phamxuantung
Communicator

Hello,

So I have a forwarder installed on a server, and it shows up under Clients in Forwarder Management.

Then I created a new app in deployment-apps, with a local/inputs.conf like this:

 

[monitor:///home/cnttm/Vibus/logTransit/application.log]
crcSalt = <SOURCE>
disable = false
index = mynewindex

[monitor:///home/cnttm/Vibus/logTransit/*.log]
crcSalt = <SOURCE>
disable = false
index = mynewindex

[monitor:///home/cnttm/Vibus/logTransit/*]
crcSalt = <SOURCE>
disable = false
index = mynewindex

[monitor:///home/cnttm/*]
crcSalt = <SOURCE>
disable = false
index = mynewindex

 

The log file path is: /home/cnttm/Vibus/logTransit/application.log

Then I created a server class with the app and the client, enabled it, and restarted.

But when I search index=mynewindex, I don't get any results, and I'm pretty sure we have logs in that directory.

Does anyone see anything wrong with my syntax? And how do I know/check whether my deployment app is working or not?


gcusello
SplunkTrust

Hi @phamxuantung,

does the file you want to monitor always have the same file name, or does it continuously change?

If it always has the same filename, you don't need the crcSalt setting; that's usually used for files with different names containing almost the same logs.

Then, does the user you're running Splunk as on the target have permission to access the folders and files?

Then, what's the format of the timestamp in the logs? Is it in European format (dd/mm/yyyy) or American format (mm/dd/yyyy)?

If it's in European format, during the first 12 days of the month you could index your logs with a wrong timestamp.

Last check: if you run this CLI command:

ls -al /home/cnttm/Vibus/logTransit/application.log

using the user that runs splunkforwarder, do you get a result?
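If you don't know which account that is, a quick way to check is something like this (a sketch, assuming a Linux host; splunkuser is just a placeholder for the real account):

ps -ef | grep splunkd
# note the account in the first column, then verify it can read the file
sudo -u splunkuser ls -al /home/cnttm/Vibus/logTransit/application.log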

Anyway, to answer your question: the only way is to try, there isn't any other way!

Last thing: you have overlapping inputs in your inputs.conf; Splunk reads a file only once, so keep only one of those stanzas.
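As a concrete way to see what actually reached the forwarder after deployment, you can run something like this on the forwarder itself (a sketch, assuming a Linux universal forwarder installed under /opt/splunkforwarder):

# show the effective monitor stanzas and which app they come from
/opt/splunkforwarder/bin/splunk btool inputs list monitor --debug

# ask the running forwarder which files it is actually tailing
/opt/splunkforwarder/bin/splunk list monitor

# look for permission or tailing errors about your file
grep -i logTransit /opt/splunkforwarder/var/log/splunk/splunkd.log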

Ciao.

Giuseppe


phamxuantung
Communicator

Sorry for the late reply, @gcusello 

1. The file I want to monitor always has the same filename. I wrote those duplicate stanzas to see if any of them would catch on, planning to slowly delete them and keep the one that works.

2. In our company, the person who installed the forwarder is in another department, so I don't really know the user or how to run the CLI command.

3. The timestamp format of the log is yyyy-mm-dd. Example:

 

2022-08-30T13:50:01.193+0700 DEBUG Enter process
2022-08-30T13:50:01.205+0700 DEBUG 
<isomsg direction="incoming">
  <field id="0" value="0800"/>
  <field id="7" value="0830065002"/>
  <field id="11" value="102316"/>
  <field id="32" value="971040"/>
  <field id="70" value="301"/>
</isomsg>

 

Does this have anything to do with getting the logs? I thought Splunk could automate the indexing process. If not, should I add anything to the config?

Also, I don't define a sourcetype in my inputs.conf, would that be a problem?


gcusello
SplunkTrust

Hi @phamxuantung,

OK, just as a check: copy the log you want to ingest to a new file with a different name, and see if it gets indexed.
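For example (the copy name is arbitrary):

cp /home/cnttm/Vibus/logTransit/application.log /home/cnttm/Vibus/logTransit/application_copy.log

If the copy shows up under index=mynewindex, the deployed input path, index, and file permissions should be fine.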

If yes, choose only one of the input stanzas and delete the crcSalt row.
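In other words, a single stanza along these lines should be enough (a sketch; note that the documented setting name is disabled, not disable, and the sourcetype name here is just a placeholder):

[monitor:///home/cnttm/Vibus/logTransit/application.log]
disabled = false
index = mynewindex
sourcetype = vibus:application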

One question: the filename is always the same, but is the content (especially the first 256 chars) always the same, or does it change?

I suppose it should change, because the timestamp changes.

About the user, you have to check this point because it's a relevant one.

About your questions:

  • Does this have anything to do with getting the logs?

    • No, because that time format doesn't cause any misinterpretation (yyyy-mm-dd is unambiguous).

  • I thought Splunk can automate the indexing process. If not, should I add anything to the config?

    • The indexing process is automated, but you have to configure it: inputs, parsing, etc.
    • The new Data Stream Processor is coming, but it isn't available yet.

  • Also, I don't define a sourcetype in my inputs.conf, would that be a problem?

    • Splunk usually identifies the sourcetype, but I always prefer to force it in my inputs.conf.
    • Remember that, with few exceptions, the parsing phase is done (with props.conf and transforms.conf) on the indexers or (when present) on heavy forwarders; this means that you don't need to put props.conf and transforms.conf on the UFs, but on the indexers and heavy forwarders (see the sketch just below).
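A rough sketch of how that split could look, assuming you force the placeholder sourcetype from the inputs.conf stanza sketched above (the timestamp settings are guesses based on your sample lines):

# props.conf on the indexers / heavy forwarders (not on the UF)
[vibus:application]
TIME_PREFIX = ^
TIME_FORMAT = %Y-%m-%dT%H:%M:%S.%3N%z
MAX_TIMESTAMP_LOOKAHEAD = 30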

Ciao.

Giuseppe


manjunathmeti
Champion

You need to reload the deployment server to push the app:

splunk reload deploy-server
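For example, run this on the deployment server (paths and the server class name are placeholders), then wait for the forwarder's next phone-home:

# reload all server classes
$SPLUNK_HOME/bin/splunk reload deploy-server

# or reload only the server class that contains your new app
$SPLUNK_HOME/bin/splunk reload deploy-server -class <your_server_class>

# on the forwarder, the app should then appear under its apps directory
ls /opt/splunkforwarder/etc/apps/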

 
