I've deployed a Cowrie honeypot on my Raspberry Pi running Ubuntu, and I now have some log files I would like to send over to Splunk Enterprise.
I'm having trouble receiving and accessing the logs in the Splunk Enterprise web interface.
Currently my inputs.conf and outputs.conf files look like this, and I'm unsure as to whether I'm missing any required stanzas or fields.
inputs.conf
[monitor:///home/ubuntu/cowrie/var/lib/cowrie/tty]
disabled = 0
outputs.conf
[tcpout]
defaultGroup=my_indexers
# new stanza
[tcpout:my_indexers]
server='my ip':9997
Am I missing any host fields, and how do I go about looking for the logs on the Splunk Enterprise side? I'm accessing the Splunk Enterprise web interface by entering 'localhost:8000.....'
I'm under the impression I might have missed a step somehow. I downloaded and extracted the Universal Forwarder, made a splunk user account, started Splunk, and edited the conf files... is that all that's required, or do I need to edit settings on the Enterprise side?
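In case it helps, this is roughly what I ran, from memory (assuming the forwarder was extracted to /opt/splunkforwarder; the archive name is just a placeholder):
# extract the Universal Forwarder and give the splunk user ownership
sudo tar -xzvf splunkforwarder-<version>-Linux-armv8.tgz -C /opt
sudo useradd -m splunk
sudo chown -R splunk:splunk /opt/splunkforwarder
# start the forwarder as the splunk user and accept the license
sudo -u splunk /opt/splunkforwarder/bin/splunk start --accept-license
# then edited inputs.conf and outputs.conf under /opt/splunkforwarder/etc/system/local and restarted
sudo -u splunk /opt/splunkforwarder/bin/splunk restart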
All I'm seeing when I enter 'index=main' in the search field is one log file, even though multiple exist in the directory. Is my syntax in the .conf files incorrect?
Hi @JP1998,
If you can see one file in your search, that proves your config is OK. Your problem may be identical files: if the first 256 bytes of two files are the same, Splunk thinks it has already indexed them. You can try adding the below to your inputs.conf:
[monitor:///home/ubuntu/cowrie/var/lib/cowrie/tty]
disabled = 0
crcSalt = <SOURCE>
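After adding crcSalt, restart the forwarder so it re-reads inputs.conf and re-evaluates the files in that directory (the path below assumes a default /opt/splunkforwarder install):
/opt/splunkforwarder/bin/splunk restart
You can also check which files the forwarder is actually watching with:
/opt/splunkforwarder/bin/splunk list monitor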