Getting Data In

Manually adding static file into paloalto index with Splunk_TA_paloalto

calvinmcelroy
Path Finder

We had a problem with our syslog server and a bunch of data went missing in the ingest. The problem was actually caused by the UF not being able to keep up with the volume of logs before the logrotate process compressed the files, making them unreadable. I caught this in progress and began making copies of the local files so that they would not get rotated off the disk. I am looking for a way to put them back into the index in the correct place in _time. I thought it would be easy but it is turning out harder than I expected. 

I have tried creating a monitor input for a local file and using cat/printf to append the log file into the monitored file. I have also tried the "add oneshot" CLI command; neither approach has gotten me what I want. The monitored file kind of works, and I think I could probably make it better given some tweaking.
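For reference, the monitor input I set up looks something along these lines (the path, index, and sourcetype here are placeholders for my real values, not the exact config):

```
# inputs.conf on the forwarding host -- path/index/sourcetype are illustrative
[monitor:///var/log/replay/pan_replay.log]
index = paloalto
sourcetype = pan_log
disabled = false
```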

The "add oneshot" command actually works very well, and it is the first time I am learning about this useful command. I believe my problem is that the sourcetype I am using is not working as intended. I can get data into the index using the oneshot command and it looks good, as far as breaking the lines into events, etc. The problem I am seeing is that the parsing rules included with the props/transforms in the Splunk_TA_paloalto add-on are not being applied effectively. Splunk is extracting some fields, but I suspect it is guessing based on the format of the data.
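In case it helps, the command I am running is roughly this (file path and index are examples, not my actual values):

```
# Run from $SPLUNK_HOME/bin on the host holding the saved files
# -sourcetype and -index set the metadata on the ingested events
./splunk add oneshot /tmp/saved/pan_traffic.log \
    -sourcetype pan_log -index paloalto
```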

When I look at the props.conf for the TA, I see it uses a general stanza called [pan_log], but inside the config it transforms the sourcetype into a variety of different sourcetypes based on the type of log in the file (the TRANSFORMS line lists nine possibilities).


TRANSFORMS-sourcetype = pan_threat, pan_traffic, pan_system, pan_config, pan_hipmatch, pan_correlation, pan_userid, pan_globalprotect, pan_decryption

When I use the oneshot command, the data goes into the index and I can find it by specifying the source, but none of these transforms are happening, so the logs are not separated into the final sourcetypes.
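From what I understand, these are index-time sourcetype rewrites, so they only fire on the first full Splunk instance that parses the data. In transforms.conf they look roughly like this (paraphrased from memory; the TA's actual regexes are more specific, so check its transforms.conf):

```
# transforms.conf (sketch) -- rewrites the sourcetype at parse time
# when the log line contains the PAN-OS log type field
[pan_traffic]
REGEX = ,TRAFFIC,
FORMAT = sourcetype::pan:traffic
DEST_KEY = MetaData:Sourcetype
```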

Has anybody run into a problem like this and know a way to make it work? Or have any other tips I can try to make some progress on this?
One thing I was thinking: the Splunk_TA_paloalto add-on is installed on the indexers, but not on the server that has the files I am running the oneshot command from. I expected the parsing would all be happening on the indexer tier, but maybe I need to add the TA locally so Splunk knows how to handle the data.

Any ideas?


tscroggins
Influencer

Hi @calvinmcelroy,

Splunk can read LZW, gzip, bzip2, or any other compression format that supports streaming via stdin/stdout if properly configured, so I'm surprised you had problems with logrotate. Is your configuration outside the norm?
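If the forwarder genuinely can't finish reading a file before logrotate compresses it, one common workaround is logrotate's delaycompress option, which leaves the most recent rotated file uncompressed for one extra cycle. A sketch (paths and schedule are illustrative, not taken from your environment):

```
# /etc/logrotate.d/syslog -- illustrative example
# delaycompress postpones compression by one rotation cycle,
# giving a slow forwarder time to finish reading the rotated file
/var/log/syslog/*.log {
    daily
    rotate 7
    compress
    delaycompress
    missingok
    notifempty
}
```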

If you're running oneshot from a host with Splunk Enterprise installed, i.e. a heavy forwarder, then yes, you should have the Palo Alto Networks Add-on for Splunk installed on that server, since parsing happens on the first heavy instance the data passes through.
