Splunk Search

Legacy users have issues with how Splunk works. How can I help them use Splunk effectively?

indeed_2000
Motivator

Hi
Some users complain about Splunk search. Before Splunk, they simply opened the log file and looked for issues.

1- As you know, log files start at the first line and finish at the last line, while Splunk search is reversed: the newest events show first.

2- Another issue is that they can't trace transactions easily in Splunk. Because of Splunk's result limits they have to set a smaller time range, and imagine how hard that is when over 1000 transactions occur each second.

 

FYI:

We tried “sort _raw”, but it is slow.
We tried the transaction command, but their transactions are unstructured and hard to identify.
We tried removing the result limits, but then searches are slow.
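For reference, in SPL those attempts look roughly like this (the index name and the transaction boundary strings are assumptions):

```spl
index=app_logs | sort 0 _raw

index=app_logs | transaction startswith="BEGIN TXN" endswith="END TXN"
```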

So they prefer to use log files instead of Splunk.
How can I help them to use Splunk effectively?

Any idea?

Thanks


gcusello
Esteemed Legend

Hi @indeed_2000,

sorry but I completely disagree with you.

In one of my projects we used Splunk to let a bank's software developers see all the logs from the development and production environments during their work.

This might not seem like a great advantage, but I disagree with you because Splunk's search features are on another planet compared to the ones you have in "vi" or "nano".

Then, using Splunk you have all the logs from many systems in one place; using log files you have to change server to see the logs from different systems.

About the issues you listed: you can easily change the sort order of the displayed events with the "sort" command (e.g. "sort 0 _time" shows the oldest events first, like a log file), but anyway this is an approach problem, not a real issue.
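A minimal sketch (the index name is an assumption):

```spl
index=app_logs
| sort 0 _time
```

The "reverse" command is an alternative that simply flips the current result order, and the leading "0" on "sort" removes its default 10,000-result limit.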

The main difference is that in a log file you cannot filter events, in Splunk you can!

About transaction tracing: I don't see how you can trace a transaction in a sequential log file except by hand, whereas in Splunk you can.

About the speed of the transaction command: my hint is to avoid it, and to look at the many answers from me and others about how to correlate logs using "stats".
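A minimal sketch of the stats approach, assuming a correlation field can be extracted (the index and the transaction_id field are hypothetical):

```spl
index=app_logs
| stats earliest(_time) AS start latest(_time) AS end values(status) AS statuses count BY transaction_id
| eval duration = end - start
```

Unlike transaction, stats runs distributed across the indexers, so it is usually much faster on large volumes.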

There are other interesting features you get using Splunk instead of direct access to logs. Usually developers don't have access to production systems and logs; in Splunk they can. As I said, I implemented a project to let developers access all the application logs from the development and production environments, using a single interface for many systems, with very strong features that you don't have with direct access to logs.

Ciao.

Giuseppe

ITWhisperer
SplunkTrust

What issues are they trying to find?

Build reports/dashboards that make finding these types of issues faster. It may still be that looking through the log helps them get a better understanding of the issue, but Splunk may help identify which log and where in the log to look. For example, consider multiple logs either fragmented by time or source.

Depending on what is in the log, a Splunk search may be able to pull out all the relevant log entries rather than having thousands of irrelevant entries clouding the picture. Admittedly, once you know which id you are looking for, grep will work too (assuming the logs are good enough to tag each entry with the relevant id).
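A sketch of such a focused search (the index, sourcetype, and id field name are assumptions):

```spl
index=app_logs sourcetype=app:server (ERROR OR WARN)
| rex field=_raw "request_id=(?<request_id>\S+)"
| search request_id="abc-123"
| table _time host request_id _raw
```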

"Depending on what is in the log" may be the key: if the logs are not well structured or the information is disjointed, it may be hard to correlate information, but Splunk may be able to help with that too.


indeed_2000
Motivator

The issues users look for vary. The main problem is that the log files are messy and there is no clear id or tag that can be used to trace them.

FYI: I’ve created different dashboards/reports that help them, but they still use the log files!

 

Any other idea?

Thanks


ITWhisperer
SplunkTrust

Messy logs are a problem whichever way you look at it.

Sometimes, looking at the logs is the easiest way to go.

Why are you trying to get them to use Splunk?

You have to be able to show them a benefit of using Splunk. It doesn't have to be a replacement for looking at logs; the benefit could simply be finding where to look faster.

I used to look at logs all the time; I still do look at them when I need to, but I can be much more focussed on which logs to look at and which timeframe in the logs. This is aided, in part, by setting up alerts so I know when an issue has occurred and only have to look through events when there is something to look for.

Another option is to demonstrate some capability that they would struggle with, perhaps identifying a transaction that took longer than normal and finding the log entries pertaining to that transaction.
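A sketch of spotting unusually slow transactions with stats (the index and request_id field are assumptions):

```spl
index=app_logs
| stats earliest(_time) AS start latest(_time) AS end BY request_id
| eval duration = end - start
| eventstats avg(duration) AS avg_d stdev(duration) AS sd_d
| where duration > avg_d + 2 * sd_d
```

The surviving request_id values can then be used to pull the full log entries for just those transactions.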

You could sit down with them and see how they currently work, and then see if you could have used Splunk to do it faster or more accurately.

Splunk is not just about looking at logs, it is also about deriving additional information from the logs. If you can't get the team who are looking at the logs to use Splunk, find some other users who would benefit from the information you can derive - management tend to love charts and graphs showing how the systems or processes are performing, by whatever measure is helpful.
