Sadly, no, there is no field for login/logout; this is why I am trying to calculate it based on whether there are events or activity for each user. Filtering is done on the source zone field. Sample event:

Jun 24 15:01:20 10.50.8.100 1 2024-06-24T15:01:20+03:00 pafw01.company.com.sa - - - - 1,2024/06/24 15:01:19,007959000163983,TRAFFIC,end,2561,2024/06/24 15:01:19,192.168.44.43,10.130.11.2,0.0.0.0,0.0.0.0,GP-Access-Organization-Services-Applications,company\user1,,ssl,vsys1,GP-VPN,Trust,tunnel.21,ethernet1/4,splunk-forwarding,2024/06/24 15:01:19,1269402,1,61723,443,0,0,0x47a,tcp,allow,33254,13498,19756,210,2024/06/24 14:36:36,1454,White-List,,7352086992805546250,0x0,192.168.0.0-192.168.255.255,10.0.0.0-10.255.255.255,,105,105,tcp-rst-from-client,0,0,0,0,,pafw01,from-policy,,,0,,0,,N/A,0,0,0,0,09a8fe83-e848-4cbb-bdff-0d35a4ce96b2,0,0,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,2024-06-24T15:01:20.681+03:00,,,encrypted-tunnel,networking,browser-based,4,"used-by-malware,able-to-transfer-file,has-known-vulnerability,tunnel-other-application,pervasive-use",,ssl,no,no,0
This is a fairly common question here; use the search to find other users' solutions. The answer will depend on what data you have about your users - most importantly, whether you have separate "log in" and "log out" events for a user, or just a single kind of "presence" event that is logged at fairly regular intervals while the user is connected, so that its absence means the user has disconnected. An additional caveat with log in/log out events: what if the user disconnected without logging out (or the log out event simply got "lost", which can happen when receiving data via UDP syslog)? Either way, you will need streamstats - either to find the last log in for a given log out, or to look at the previous event and decide whether the current one is sufficiently far back in time to constitute a new session.
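To make the streamstats approach concrete, here is a minimal sketch assuming "presence"-style events with a src_user field, where any gap of more than 15 minutes (900 seconds) between consecutive events is treated as the start of a new session - the index, field names, and gap threshold are placeholders to adjust for your environment:

```
index=pa src_zone="GP-VPN" src_user="*"
| sort 0 src_user _time
| streamstats current=f last(_time) as prev_time by src_user
| eval gap = _time - prev_time
| eval new_session = if(isnull(gap) OR gap > 900, 1, 0)
| streamstats sum(new_session) as session_id by src_user
| stats min(_time) as session_start max(_time) as session_end by src_user session_id
| eval session_duration = session_end - session_start
| stats sum(session_duration) as total_online_seconds by src_user
```

The first streamstats carries the previous event's timestamp forward per user; the second assigns a running session number every time the gap threshold is exceeded, so durations are summed per session rather than from the first event to the last.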
You presumably have more detail in your events which tells you that the user wasn't logged on for all that time? Please share some anonymised, representative events that would allow you to determine when the user logged on and when they logged off. Alternatively, do the events have some sort of session identifier which would allow you to determine how long each session lasted?
We tried to implement this TA in our environment at both the indexer and HF level, but the data is still not parsed. We are sending data from TM to a syslog-ng server and reading it through a HF, which sends it to the IDXs. We tried using sourcetype=trendmicro and the data is still not parsed into the other sourcetypes. Kindly let me know what I am doing wrong here.
Dears,

I am trying to calculate the total duration each user spends connected through VPN - their total online time.

I am using the search below, but the issue is that, for example, in a 24-hour range, if the user logged in for only 10 minutes at 1AM and then again for 1 hour at 11AM, the duration output will be 10 hours, because it takes the very first event and then the very last event. How can I calculate based only on the timeslots that actually have events?

index=pa src_zone="GP-VPN" src_user="*"
| stats earliest(_time) AS earliest latest(_time) AS latest BY src_user
| eval duration = tostring((latest-earliest)/60)

Timeline below; the result should be ~14 hours.

Search results (duration in minutes), showing 24 hours, which is not correct due to the gap time:

user     earliest          latest            duration
user1    1719144008.192    1719230507.192    1441.6500
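One way to count only the timeslots that actually have events is to bin the events into fixed-width buckets and count the distinct buckets per user. A sketch assuming 5-minute buckets (the span is a tuning choice; the error per session is bounded by one bucket width):

```
index=pa src_zone="GP-VPN" src_user="*"
| bin _time span=5m
| stats count by src_user _time
| stats count as active_slots by src_user
| eval online_minutes = active_slots * 5
```

The first stats collapses each user's events into one row per active 5-minute slot; the second counts those slots, so gaps with no events contribute nothing to the total.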
Thank you so much! The join worked as expected!
What exactly did you change, and what were the expected results? The comments in transforms.conf and props.conf must not be uncommented, because they are not valid settings.
Hi @ITWhisperer, the rex you provided is also not working as expected.
We are collecting data over a site-to-site VPN, so to manage things properly and comply with security policies, instead of allowing all IPs to communicate with the IDXs we only allowed the HF working as an IF to connect to the IDXs, and all UFs connect to the IF.

By the way, thanks for your response. Can you provide some documentation for this?
Hi @Nawab, compression must be applied on both connections: between the UFs and the IF, and between the IF and the IDXs. Only one question: why do you need an IF? Ciao. Giuseppe
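As a rough sketch of where the setting lives (the stanza name, hostname, and port below are hypothetical; adapt them to your own outputs.conf/inputs.conf), compression is enabled with the compressed setting, and it must match on both the sending and the receiving side of each hop:

```
# outputs.conf on every UF (sending to the IF) and on the IF (sending to the IDXs)
[tcpout:my_receivers]
server = receiver.example.com:9997
compressed = true

# inputs.conf on the receiver of each hop (the IF, and each IDX)
[splunktcp://9997]
compressed = true
```

Note that if a connection already uses SSL, compression is typically handled as part of the SSL configuration instead, so check the SSL-related settings in that case.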
Thank you for your support. I have updated the information; sorry, I did not capture it earlier.
We have multiple forwarders sending data to an intermediary forwarder (IF), and that IF is sending the data to the IDXs. The IF is not storing any data in this case.

If we enable compression on the IF, will it automatically apply to data coming from the UFs, or should we do this configuration on all UFs as well?
Here are the contents of the local.meta file:

[app/install/install_source_checksum]
version = 9.2.1
modtime = 1719187793.873274000

[]
access = read : [ * ], write : [ * ]
export = system
version = 9.2.1
modtime = 1719188413.914413000
Hi @ITWhisperer, below is the error message I got:

The extraction failed. If you are extracting multiple fields, try removing one or more fields. Start with extractions that are embedded within longer text strings.
Please provide more specific examples of the events you are dealing with.
Dashboards are essentially representations of search results. Do you have some searches that provide the information you want from your logs?
Firstly, this looks like JSON, so you should probably use JSON extractions. If you are getting errors with those, then perhaps you could share what you tried and what errors you got, and perhaps it can be resolved that way. However, if you want to continue down the rex track (not recommended), you could try something like this:

| rex "\"CrmId\": \"(?<CrmId>[^\"]+).*\"status\": \"(?<status>[^\"]+).*\"source\": \"(?<source>[^\"]+).*\"leadId\": \"(?<leadId>[^\"]+).*\"isFirstLead\": \"(?<isFirstLead>[^\"]+).*\"offersinPrinciple\": \"(?<offersinPrinciple>[^\"]+).*\"sourceSiteId\": \"(?<sourceSiteId>[^\"]+).*\"howDidYouHear\": \"(?<howDidYouHear>[^\"]+)"
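Since the event is JSON-like, a simpler starting point than rex is spath, which walks the JSON structure and extracts fields by name. A minimal sketch - the field list here is just an example taken from the sample event, and nested fields are addressed by their dotted path:

```
... | spath
| table eventTime, CrmId, status, source, leadId, isFirstLead, "offersInPrinciple.offersinPrinciple", "online.howDidYouHear"
```

One caveat: spath only extracts cleanly when the raw event is valid JSON, and the sample shown appears to have unbalanced braces, which may also be why the built-in field extraction is failing.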
Hi team, I need to extract the highlighted fields in the message below using regex. I have tried Splunk's built-in field extraction, but it throws an error when I use multiple fields.

{ "eventTime": "2024-06-24T06:15:42Z", "leaduuid": "1234455", "CrmId": "11111111", "studentCrmUuid": "634543564", "externalId": "", "SiteId": "xxxx", "subCategory": "", "category": "Course Enquiry", "eventId": "", "eventRegistrationId": "", "status": "Open", "source": "Online Enquiry", "leadId": "22222222",  "assignmentStatusCode": "", "assignmentStatus": "", "isFirstLead": "yes", "c4cEventId": "", "channelPartnerApplication": "no", "applicationReceivedDate": "", "referredBy": "", "referrerCounsellor": "", "createdBy": "Technical User",  "lastChangedBy": "Technical User" , "leadSubAgentID": "", "cancelReason": ""}, "offersInPrinciple": {"offersinPrinciple": "no", "oipReferenceNumber": "", "oipVerificationStatus": ""}, "qualification": {"qualification": "Unqualified", "primaryFinancialSource": ""}, "online": {"referringUrl": "", "idpNearestOffice": "", "sourceSiteId": "xxxxx", "preferredCounsellingMode": "", "institutionInfo": "", "courseName": "", "howDidYouHear": "Social Media"}
Hi Team, we are setting up minimalistic dashboards for application logs. The application logs include local server logs, application logs, TIBCO logs, and Kibana logs. Is there a standard dashboard setup available for application log monitoring? Please guide me in creating one dashboard for application log monitoring.

Thanks,
Hello everyone, I am a newbie in this field and am looking forward to your help. I am using Eventgen to create data samples for Splunk Enterprise.

I have a datamodel "Test", a dataset "datasetA" in that datamodel, "datasetB" inherited from "datasetA", and "datasetC" inherited from "datasetB". All the data samples satisfy the base search and constraints of all the datasets; that is, every data sample appears in all 3 datasets above. The problem is that there are values for datasetA.fieldname, but not for datasetB.fieldname, even though datasetB is inherited from datasetA. Has anyone had the same problem?

More information (sorry, I do not have a capture of it). Example:

|tstats values(datasetA.action) from datamodel=Test -> result: 3 actions
|stats values(datasetA.datasetB.action) from datamodel=Test -> result: no results found

The data samples in datasetA and datasetB are the same. Thank you for reading.
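For what it's worth, in tstats queries against a data model, a field keeps the prefix of the dataset where it is defined, even when you query a child dataset; you restrict the query to a child with the nodename filter rather than by renaming the field. A hedged sketch against the datasets described above:

```
| tstats values(datasetA.action) from datamodel=Test where nodename=datasetA.datasetB
```

If this returns the expected actions, the original query likely failed because datasetA.datasetB.action is not a valid field name for an inherited field (and note the second query above also used stats rather than tstats).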