Splunk Search

How to use timechart to display failed user login info, sorted by time/date of each login attempt?

rkris
Explorer

I'm trying to display failed user login information using a timechart, but I'm not sure how to show the time and date of the logins for each user.

This is my code:

source="General-linux-sql.log" AND sourcetype="Linux" AND "Failure Audit" AND "Logon "
| rex "User Name\: (?<User_Name>[^\s]+)"
| timechart count by User_Name

This is the output that I get. Also, how do I change it so that all the users are separated?

[Screenshot of the timechart output]
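In other words, I'd like something closer to one row per failed attempt with its timestamp. A rough, untested sketch of what I mean (reusing the same rex, with table and sort just laying out the matching events one row per attempt) would be:

source="General-linux-sql.log" AND sourcetype="Linux" AND "Failure Audit" AND "Logon "
| rex "User Name\: (?<User_Name>[^\s]+)"
| table _time User_Name
| sort _time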


rkris
Explorer

How do I display the time for each of the logins as well?


anilchaithu
Builder

@rkris 

You can do the following to see the users separately:

  • Change the area chart to a line chart (or a column chart)
  • If you choose a line chart, set Format -> General -> Multi-series mode -> Yes

You can play with both the chart type and the format options to improve the look and feel.

Hope this helps

rkris
Explorer

@anilchaithu 

Thanks for the solution! However, I now have another problem with the Y-axis title: it is unreadable even though I changed it. Do you have a fix for this?

[Screenshots showing the unreadable Y-axis title]


isoutamo
SplunkTrust

Have you tried trellis as the visualization? It should work if you have fewer than 20 users; if you have more, you must divide them into groups of 20.
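For example, one way to keep the series count under that cap is to limit the timechart to the top 20 users. A rough sketch based on the search posted above (limit and useother are standard timechart options):

source="General-linux-sql.log" AND sourcetype="Linux" AND "Failure Audit" AND "Logon "
| rex "User Name\: (?<User_Name>[^\s]+)"
| timechart limit=20 useother=f count by User_Name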

r. Ismo

