I have an annoying log that I am trying to extract data from, and I am lost as to where to go from here. What I am trying to extract is as follows:

```
2020-10-02 17:01:32,360 INFO: User.val (value, value2, value3, value4): User not found. Parameters: userId: 1; requester: userVO:
userId: 66666
status: V
username: joe.blogs@someplace.com
authenticationMethod: PASSWORD
emailAddress: joe.blogs@someplace.com
firstName: Joe
middleName:
lastName: Bloggs
displayName: Joe Blogs
createdBy: 123456
dateCreated: 2019-07-02 17:17:29.68
lastUpdatedBy: 66666
dateLastUpdated: 2020-07-20 16:49:30.409
signupCompletedDate: 2019-07-03 14:24:52.389
lastSignInDate: 2020-10-01 19:04:21.787
title: Person
company: Somewhere
addressLine1: 1 This Street
addressLine2:
city: Somewhere
state: ST1
zipCode: 1234
country: ThatCountry
workPhoneNumber:
homePhoneNumber: +001122334455
mobilePhoneNumber:
otherPhoneNumber:
faxNumber:
secretQuestions: []
signInLocked: false
signInFailureCount: 0
signInTotalFailureCount: 0
signInLastFailureDate: <null>
resetPasswordFailureCount: 0
resetPasswordTotalFailureCount: 0
resetPasswordLastFailureDate: <null>
recipientInclusionList:
recipientExclusionList:
allowSMTPInput: false
lastPasswordResetDate: 2019-08-20 15:06:00.856
passwordExpires: true
forcePasswordReset: false
externalUser: false
lastSignInUserName: joe.blogs@someplace.com
lastSignInDomain:
activationCode:
expiryDate: <null>
expiredOn: <null>
lastActivityDate: 2020-10-01 19:07:12.088
autoUnlockCount: 0
manualUnlockRequired: false
selfRegIPAddress: 192.168.0.1
senderRoleExpired: false
externalUser: false
channelType: Web
ipAddress: 10.1.1.1
```

The first line carries the event timestamp (i.e. `2020-10-02 17:01:32,360 INFO:`), and this would be used for my indexed time.
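As a sanity check outside Splunk, the header timestamp parses cleanly with a single strptime-style pattern (the Splunk-side equivalent would be `TIME_FORMAT = %Y-%m-%d %H:%M:%S,%3N`; this snippet is just my own verification, not anything from the product):

```python
from datetime import datetime

# Timestamp portion of the event header, before the log level token
stamp = "2020-10-02 17:01:32,360"

# %f accepts the 3-digit millisecond field that follows the comma
ts = datetime.strptime(stamp, "%Y-%m-%d %H:%M:%S,%f")
print(ts.isoformat(timespec="milliseconds"))  # 2020-10-02T17:01:32.360
```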
Between this user event and the next, the log is interspersed with garbage like the following:

```
2020-10-02 16:59:36,409 ERROR: Mail.send(): (Task ID: x4) Error while sending message: javax.mail.SendFailedException: Invalid Addresses;
  nested exception is:
	com.sun.mail.smtp.SMTPAddressFailedException: 501 5.1.3 Invalid address
javax.mail.SendFailedException: Invalid Addresses;
  nested exception is:
	com.sun.mail.smtp.SMTPAddressFailedException: 501 5.1.3 Invalid address
	at com.sun.mail.rcptTo(SMTPTransport.java:1862)
	at com.sun.mail.sendMessage(SMTPTransport.java:1118)
	at com.neesh.util.Mail.send(Unknown Source)
	at com.neesh.fds.util.EmailHelper.sendEmail(Unknown Source)
	at com.neesh.fds.core.MailSenderProcess.sendEmail(Unknown Source)
	at com.neesh.fds.core.MailSenderProcess.executeHelper(Unknown Source)
	at com.neesh.fds.core.AbstractFDSProcess.execute(Unknown Source)
	at com.neesh.fds.core.AbstractFDSProcess.startup(Unknown Source)
	at com.neesh.fds.core.MailSenderProcess.startup(Unknown Source)
	at com.neesh.fds.core.FDSProcessThread.run(Unknown Source)
Caused by: com.sun.mail.smtp.SMTPAddressFailedException: 501 5.1.3 Invalid address
	at com.sun.mail.smtp.SMTPTransport.rcptTo(SMTPTransport.java:1715)
	... 9 more
2020-10-02 16:59:36,409 WARN: Mail.send(): (Task ID: x4) Exiting send() with error code: -2
2020-10-02 16:59:36,409 ERROR: MailSenderProcess.executeHelper(): Invalid Addresses
```

I started by adding the data in and using the Advanced configuration to try to break this up, starting with BREAK_ONLY_BEFORE_DATE set to true. This starts to break the log, but then (as expected) it breaks at every date, so events split at every field that contains a date (e.g. lastSignInDate, dateCreated, etc.). The problem is that timestamp extraction is then impacted: each broken chunk is stamped with whatever date appears in it, so the indexed time for those events is all over the place instead of the first timestamp (i.e. 2020-10-02 17:01:32).

What I would like to do is capture everything between "2020-10-02 17:01:32,360 INFO:" and "ipAddress: 10.1.1.1" (using the example above). The log is a rolling log, so it is constantly being written to. I would also like to get rid of the garbage, but I have not yet tried dropping those events (e.g. routing them to the nullQueue) before they are indexed. There is no recognised sourcetype, nor does the product have any TAs on Splunkbase, so I am effectively trying to create a new TA for this data source. Thank you for any assistance.
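For reference, this is a minimal props.conf/transforms.conf sketch of the direction I am considering, not a working config: the sourcetype name `myapp:log` and the noise-matching regex are my own placeholders, and I have only guessed at the set of log levels (INFO/WARN/ERROR) from the samples above.

```
# props.conf -- sketch only; [myapp:log] is a placeholder sourcetype name
[myapp:log]
SHOULD_LINEMERGE = false
# Break only where a timestamp is immediately followed by a log level,
# so embedded dates (dateCreated, lastSignInDate, ...) do not split events
LINE_BREAKER = ([\r\n]+)(?=\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2},\d{3} (?:INFO|WARN|ERROR):)
TIME_PREFIX = ^
TIME_FORMAT = %Y-%m-%d %H:%M:%S,%3N
MAX_TIMESTAMP_LOOKAHEAD = 25
# Route the mail stack-trace noise to the null queue before indexing
TRANSFORMS-dropnoise = drop_mail_noise

# transforms.conf -- the REGEX is a guess at what uniquely marks the garbage
[drop_mail_noise]
REGEX = Mail\.send\(\)|MailSenderProcess|SendFailedException
DEST_KEY = queue
FORMAT = nullQueue
```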