Getting Data In

Only 1 forwarder showing logs in Month, Day, Year format; I want the default of Day, Month, Year.

sarlacc
Explorer

I'm running Splunk Enterprise 9.1.1.  It is a relatively fresh installation (done this year).  Splunk forwarders are also using version 9.1.1 of the agent.

The indexer is also the deployment server.  Beyond that, I only have forwarders forwarding to it. 

I have one Linux host (Red Hat 8.9) with this problem. I've deployed Splunk_TA_nix and enabled rlog.sh to pull info from /var/log/audit/audit.log.

Using today as an example (06/05/2024), I don't see entries for 06/05/2024.  But I do see logs from today under 05/06/2024.

Example from the splunk search page:

index="linux_hosts" host=bad_host          (last 30 days)

05/06/2024 (shown at the left side of the event)     audit data...........(06/05/2024 14:32:12) audit data.........
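To show the mismatch side by side, this is the kind of search I've been running to compare what Splunk parsed against the raw event (just a sketch; _time, _indextime, and _raw are standard Splunk fields, and the index/host are the same as above):

index="linux_hosts" host=bad_host earliest=-30d
| eval parsed_time=strftime(_time, "%m/%d/%Y %H:%M:%S")
| eval time_indexed=strftime(_indextime, "%m/%d/%Y %H:%M:%S")
| table parsed_time time_indexed _raw

On the bad host, parsed_time comes back with the month and day swapped relative to the timestamp embedded in _raw, which is exactly the 05/06 vs 06/05 situation shown above.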

As I mentioned above, I have one deployment server. All forwarders use the same, centralized one. It's a small environment, I'd say ~25 Linux hosts (Red Hat 7 and 8). This is the only Red Hat 8 host with this problem.

I tried reinstalling the Splunk forwarder (I completely deleted /path/to/splunkforwarder after uninstalling it).

I know a little about using props.conf with TIME_FORMAT but have not done so. My logic is that if I needed it, I'd see this on all forwarders, not just the one with this problem.

I ran localectl and it shows en_US. ausearch -i (the same thing rlog.sh does) shows the dates/times as I'd expect. Anything else I should look for from the OS perspective? Any suggestions on what I could do from Splunk? Also, I noticed that when I go to the _internal index, dates/times are consistent. It's only in my normal index (linux_hosts), and only on this one RH8 host, that I see the problem. The other Red Hat 8 hosts are what I'd expect.
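For reference, these are the OS-side checks I ran (a sketch; exact ausearch behavior can vary by auditd version):

# Locale and clock settings on the affected host
localectl status
timedatectl

# Sample what rlog.sh ultimately emits (it wraps ausearch -i)
ausearch -i | tail -n 5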

A side note here: someone else suspected this host wasn't logging, so they did a manual import of the audit.log files. Mind you, the dates in those files were not parsed, since they didn't go through rlog.sh (ausearch -i) first. Could this also be part of the problem? If so, how can I undo what was done?
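If the manual import does turn out to be polluting the index, one option I've read about is Splunk's delete command (only a sketch: the source path below is a guess at whatever the manual import used, delete needs a role with the can_delete capability, and it only hides events from search rather than freeing disk space):

index="linux_hosts" host=bad_host source="/tmp/audit.log" earliest=0 ``` /tmp/audit.log is a hypothetical path - use the real source of the manual import ```
| delete

I'd run the search without the final | delete first, to confirm it only matches the manually imported events.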

 

Thanks!

1 Solution

sarlacc
Explorer

I resolved this by rebuilding the operating system of the affected host.

It previously had a Splunk 6.x forwarder and I uninstalled the old version before I installed the new one.

I also did a rm -rf on the old forwarder path.

I had thought removal of the /opt/path_where_forwarder_was_installed would be enough.

Are there other files/directories that I should also include in my uninstall script?
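In case it's useful, this is roughly what my uninstall script does now (a sketch for a Linux universal forwarder under /opt/splunkforwarder with boot-start enabled; the service name, init script, and splunkfwd account are assumptions based on a default package install and may differ on your hosts):

#!/bin/bash
# Stop the forwarder and remove its boot-start hook before deleting anything
/opt/splunkforwarder/bin/splunk stop
/opt/splunkforwarder/bin/splunk disable boot-start

# Remove the installation directory itself
rm -rf /opt/splunkforwarder

# Clean up any leftover init/systemd entries (names assumed from a default install)
rm -f /etc/init.d/splunk
systemctl disable SplunkForwarder.service 2>/dev/null
rm -f /etc/systemd/system/SplunkForwarder.service
systemctl daemon-reload

# Optionally remove the service account the forwarder package created
userdel splunkfwd 2>/dev/null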


sarlacc
Explorer

After later review, the rebuild of the host did not fix the problem. In fact, the problem was more widespread than I had indicated; I'd say 3-5% of my hosts were seeing it.

I found the release notes for the latest version of the Splunk_TA_nix add-on. Once I downloaded/installed that version of the add-on, new records started appearing.

https://docs.splunk.com/Documentation/AddOns/released/UnixLinux/Releasenotes
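For anyone else hitting this, here's how I checked which Splunk_TA_nix version a forwarder actually had before upgrading (a sketch; it assumes the add-on sits in the default apps directory under /opt/splunkforwarder):

# The add-on's version is recorded in its app.conf
grep -i '^version' /opt/splunkforwarder/etc/apps/Splunk_TA_nix/default/app.conf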

 

 


gcusello
SplunkTrust

Hi @sarlacc ,

good for you, see you next time!

Please accept your last message as the solution, to help other people in the Community find the right answer.

Ciao and happy splunking

Giuseppe

P.S.: Karma Points are appreciated 😉


gcusello
SplunkTrust

Hi @sarlacc ,

first of all, it isn't a good idea to have the Deployment Server on Indexers or Search Heads; do you have another server?

You can use a server shared with other roles only if the DS has to manage up to 50 clients; for more than that, a dedicated server is required.

About the data: first check the TIME_FORMAT. What's the format of your date, European (dd/mm/yyyy) or American (mm/dd/yyyy)? By default Splunk uses the American format.
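Something like this in props.conf (where the data is parsed, normally the indexer) is the usual way to force it; only a sketch, the [auditd] sourcetype name and the timestamp layout are assumptions you have to verify against the add-on's inputs.conf and a raw event:

# props.conf - timestamp settings for the audit data (sourcetype name assumed)
[auditd]
# timestamp appears inside parentheses, e.g. (06/05/2024 14:32:12), per your example
TIME_PREFIX = \(
TIME_FORMAT = %m/%d/%Y %H:%M:%S
MAX_TIMESTAMP_LOOKAHEAD = 20

Remember this only affects events indexed after a splunkd restart, not data already indexed.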

About the only Universal Forwarder with issues: do you have internal Splunk logs (_* indexes) from it?

If yes, it's an issue with that data source; if not, there's a connection issue.
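For example (just a sketch, using the host name from your search above):

index=_internal host=bad_host earliest=-24h
| stats count by sourcetype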

Ciao.

Giuseppe


sarlacc
Explorer

Hi @gcusello,

My indexer (which is also the deployment server) manages fewer than 50 hosts. The environment is rather small.

The format of the date is American (mm/dd/yyyy). I can also see that when I do an ausearch -i on the audit.log; the date is in the format I'd expect, as indicated above.

In terms of the _internal index, dates are correct. Yesterday the _internal index showed 06/05/2024 and the logs themselves were the same. I'm not sure I'm answering your question, however; let me know if you need other information about the _* indexes.

I will try setting props.conf on this one particular host and reply back with the results tomorrow.

 


sarlacc
Explorer

I made the change to props.conf, but I'm not seeing anything show up today (06/07/24). The 06/06/24 logs were fine, as I'd expect.

I'd like to make sure I did the props.conf right:

I put this under etc/system/local/props.conf

[auditd] {is this the right text here? I thought I had read this should be the sourcetype. I got the sourcetype from inputs.conf of rlog.sh under the Splunk_TA_nix app.}

TIME_FORMAT=%m/%d/%y %T:%3N {I did this to mimic what I see when I do a search from the Splunk web UI; this is what shows up at the left before the actual log entry}
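One thing I'm also going to double-check is where the stanza is actually being read from (a sketch, assuming a default /opt/splunk install path on the indexer; my understanding is that data from a universal forwarder gets its timestamps parsed on the indexer, so props.conf on the forwarder itself wouldn't change anything, and only newly indexed events are affected after a restart):

# Show the effective props.conf settings for the auditd sourcetype and which file they come from
# (run on the indexer; /opt/splunk is the assumed install path)
/opt/splunk/bin/splunk btool props list auditd --debug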

