Hi,
I am new to Splunk and I'm trying to configure the Syslog for Sourcefire Defense Center. I am using the latest version of Splunk Light (installed on Windows 7 64 bit) and the latest Defense Center. I have configured the Defense Center to send Syslogs on TCP 514. I have configured the data input as "syslog" and "TCP 514", but I am unable to see the Syslogs on Splunk search.
I ran a wireshark on the Windows 7 on which Splunk is installed, and I confirm that the Syslogs are being captured. I must be missing some configuration on the Splunk. Can you please advise?
Thank you
There's a list of things to try here. It's a long, long list, so I apologize for the length. Please be careful as you go through, perhaps printing it out and checking things off as you test them. You'll have to google some of the pieces, too, using your own environment's information. If you continue to have trouble after going through this, please list which steps you tried and worked fine, and where it finally went wrong.
BTW, a few of these steps are skipped (see the first paragraph below), and some of the later ones repeat things you have already tried. Please just try them again for the sake of completeness.
First, you have already confirmed the packets are making it to your server via tcpdump/wireshark. Great, that would be step one and knowing that removes the entire "Have you configured your SourceFire Defense Center properly" question.
Now, on the OS side.
Second, absolutely double-check your firewall is turned off. It really only needs the right exception (port 514 TCP and UDP), but simply turning it off will work fine. To confirm, please find a third machine (I'll assume running Windows since that's what your Splunk box is running - modify as appropriate if you can only get your hands on a *nix box of a sort) and from that third, extra machine open a command prompt (As Administrator if you have UAC still on) and in there type
telnet 20.20.20.50 514
If you get a "Could not open connection to the host ..." then you simply don't have anything listening on 514 or it is firewalled. This and any other error condition must be corrected before anything farther down the chain will work. As long as you get nothing but a blinky cursor that "goes away" as soon as you try typing something, then you are likely good here.
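If you'd rather leave the firewall on and just add the exception mentioned above, something like the following from an elevated command prompt on the Splunk box should do it (a sketch only; the rule names are placeholders I made up):
netsh advfirewall firewall add rule name="Splunk syslog UDP" dir=in action=allow protocol=UDP localport=514
netsh advfirewall firewall add rule name="Splunk syslog TCP" dir=in action=allow protocol=TCP localport=514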
Third, now that we've confirmed you absolutely and unequivocally have something actually listening on 514 and that there's no firewall blocking communication, we need to confirm your inputs. On your Splunk box, open a command prompt and type
cd \program files\splunk\bin
splunk cmd btool inputs list --debug | clip
Then open Notepad and click Edit/Paste. That should drop a whole lotta "stuff" into Notepad. Page down through there until in the right column you see the input you have set up on port 514 (or search a few times for "514" and you'll find it at some point). Look at it. See if it makes sense. You can paste that portion into a comment here and we can take a look if it doesn't make enough sense to you. Here's a little help in using btool. Below I've pasted the bits I have in a temporary UDP input on 5514 on my *nix based Splunk server. Yours will be similar but will have different paths and stuff.
/opt/splunk/etc/apps/search/local/inputs.conf [udp://5514]
/opt/splunk/etc/system/default/inputs.conf _rcvbuf = 1572864
/opt/splunk/etc/apps/search/local/inputs.conf connection_host = ip
/opt/splunk/etc/system/local/inputs.conf host = splunk-test
/opt/splunk/etc/apps/search/local/inputs.conf index = main
/opt/splunk/etc/apps/search/local/inputs.conf sourcetype = cisco_syslog
You'll see in there that it specifies the index it's going to and the sourcetype. It also tells you which file set each particular setting. You can see that for my test I just created the input in the context "search" (look on the left in the path), but some settings I didn't set there are being picked up from system default (the default settings) or system local (think of those as my local environment overrides to the defaults).
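For comparison, a minimal sketch of what the equivalent TCP 514 stanza might look like on your Windows install, probably in C:\Program Files\Splunk\etc\apps\search\local\inputs.conf (the sourcetype and index here are just the ones from my example above, so yours may well differ):
[tcp://514]
connection_host = ip
sourcetype = cisco_syslog
index = main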
Lastly, assuming you have had success all the way to this point, you can use the information from above to craft a search of the index where this data is actually going. Make sure you are logged in as admin so that you have access to all indexes; in my case a search like index=main
over all time should pull up events.
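To narrow it down a bit more, a search like this should also work, assuming the sourcetype from my example; earliest=0 just forces the search across all time regardless of the time picker:
index=main sourcetype=cisco_syslog earliest=0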
Again, if you follow the above until you get to something that "doesn't look right", that will help a lot in narrowing down where things are going wrong.
Hi, many thanks for this detailed answer. I have gotten to Step 3, but the command you stated:
splunk cmd btool inputs list --debug | clip
does not place anything onto my clipboard, so I have nothing to paste into Notepad. I tried this several times.
Sorry, I missed that update. Not sure why that wouldn't work.
For future searchers, perhaps running your command prompt as Administrator (or vice versa, NOT running it as Administrator?) would resolve that. If you had to, you could simply run the command and pipe it to "more"; you'll have to take care to read each screen before pressing a key to continue, but it would work also.
splunk cmd btool inputs list --debug | more
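Another option, assuming you can write to your temp directory, is to redirect the output to a file and open that in Notepad:
splunk cmd btool inputs list --debug > "%TEMP%\btool_inputs.txt"
notepad "%TEMP%\btool_inputs.txt"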
Glad you found your data!
Wow, funnily enough, when I tried Step 2 I was able to see the log in Splunk when I set the search to index=main
"vvfff" were the keys I typed after the telnet connection went through.
Okay, it works. I set it to UDP 514 and cisco_syslog, thanks a lot!!!
I haven't received any answers. Can a Splunk expert please provide solutions? Thank you!
Are you 100% sure no other service is bound to UDP port 514? For example, you might have Kiwi Syslog Server or something similar also installed on the server.
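In case it helps, something like this from a command prompt should show whether anything is bound to 514 and which process owns it (replace 1234 with whatever PID the first command reports):
netstat -ano | findstr :514
tasklist /FI "PID eq 1234"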
Hi mikaelbje, yes, I tried the netstat command and found that only Splunk was bound to UDP 514.
Bumping this
Again bumping this, receiving no answers
Check your Windows Firewall 🙂
And why is your Splunk server constantly pinging the Defense Center?
That was the first thing I did before raising this case here. 🙂 Turn off Windows Firewall.
I am not sure why Splunk is doing that.
Can you confirm with a wide-open all-time search that they're not in there? From that point, you can drill down and find them if they're anywhere... Something like:
index=*
Run over all time. Then perhaps start digging into the host fields looking for your Defense Center IP. The events could be timestamped incorrectly and landing in the future or the past, or, more likely, they're just going into an index you aren't searching by default.
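A couple of sketches of what I mean, assuming 20.20.20.12 is the Defense Center, both run over all time:
index=* host=20.20.20.12
| metadata type=hosts index=*
The metadata search lists every host Splunk has seen in each index along with first and last event times, which makes badly timestamped events easy to spot.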
Once we get past this and confirm if they are anywhere in Splunk we can likely sort out the rest pretty easily.
Hi,
Thanks for your reply. The aforementioned query index=* yields no results.
Hi, any luck?
Try checking the _internal index for "syslog" and your input "TCP 514"; that should tell you whether the data is getting stopped before it reaches the splunk process (you'll find no record of items coming in), or whether there are some internal configuration or parsing issues stopping the data from being fully indexed. Also, if you know the name of the host and/or IP the syslog is coming from, throw those into a query against _internal, just in case the first two searches yield nothing.
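For example (these are just sketches, so adjust the terms to your own host and IP):
index=_internal "514"
index=_internal source=*metrics.log* group=tcpin_connections
index=_internal source=*splunkd.log* ERROR "514"
The metrics.log search should show a tcpin_connections line for each host actively sending data over a TCP input, if anything is making it that far.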
Hi, can you please help?
Okay, so the _internal index shows some logs pertaining to the Splunk system, but it does not show any syslogs from the host.
So, over here, I have uploaded an image of the packet capture: (20.20.20.12 is the Defense Center and 20.20.20.50 is the Splunk server)
I have shown the query I entered here -> it returns no results
Can you try testing this syslog source using a UDP input on port 514 vs a TCP input? Since you're getting nothing regarding the host in Splunk, it means it's probably not hitting the input queue at all and is getting stopped in the socket layer somewhere.
Also try doing a search in the _internal index for "syslog" and "UDP" in case it is using a different hostname for your device (since there are some default syslog transforms), and to see if there are any error messages regarding the protocols/inputs themselves.
Also, please try using a different port for the TCP syslog instead of 514 (try one of the unused four-digit ports and verify it's fully opened up between both devices). TCP 514 is actually registered for the remote shell (rsh) service, which may be causing issues for the proper handling of this traffic at the OS level.
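If it helps, a minimal sketch of what the stanzas could look like in inputs.conf (the 10514 port and the sourcetype are just placeholders, so use whatever fits your environment, or create the equivalent inputs through the UI as you did originally):
[udp://514]
connection_host = ip
sourcetype = cisco_syslog
index = main

[tcp://10514]
connection_host = ip
sourcetype = cisco_syslog
index = main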