All Posts


Please try this if you want to add the port:

| makeresults
| eval dest="example.com", dest_port=8441
| lookup sslcert_lookup dest dest_port OUTPUT ssl_subject_common_name ssl_subject_alt_name ssl_end_time ssl_validity_window
| eval ssl_subject_alt_name = split(ssl_subject_alt_name,"|")
| eval days_left = round(ssl_validity_window/86400)
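A small follow-on sketch, in case the end goal is to flag certificates that are close to expiry; the 30-day threshold and the final table are assumptions on top of the original answer, not part of it:

| makeresults
| eval dest="example.com", dest_port=8441
| lookup sslcert_lookup dest dest_port OUTPUT ssl_subject_common_name ssl_end_time ssl_validity_window
| eval days_left = round(ssl_validity_window/86400)
| where days_left < 30
| table dest dest_port ssl_subject_common_name days_left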
Here is the exact content of the message field:

[{"TARGETSYSTEM":"CPW","ARUNAME":"CPW_ARO_P","TOTAL":0,"PROCESSED":0,"REMAINING":0,"ERROR":0,"FAILED":0,"SKIPPED":36,"PROCESSING":0,"DATE":"6/27/2024","DAYHOUR":23},{"TARGETSYSTEM":"CPW","ARUNAME":"CPW_ARO_P","TOTAL":0,"PROCESSED":8,"REMAINING":0,"ERROR":0,"FAILED":0,"SKIPPED":0,"PROCESSING":0,"DATE":"6/27/2024","DAYHOUR":23},{"TARGETSYSTEM":"CPW","ARUNAME":"CPW_ARO_P","TOTAL":0,"PROCESSED":0,"REMAINING":0,"ERROR":1,"FAILED":0,"SKIPPED":0,"PROCESSING":0,"DATE":"6/27/2024","DAYHOUR":23},{"TARGETSYSTEM":"CPW","ARUNAME":"CPW_00410","TOTAL":0,"PROCESSED":4,"REMAINING":0,"ERROR":0,"FAILED":0,"SKIPPED":0,"PROCESSING":0,"DATE":"6/27/2024","DAYHOUR":23},{"TARGETSYSTEM":"CPW","ARUNAME":"CPW_ARO_H","TOTAL":0,"PROCESSED":0,"REMAINING":0,"ERROR":0,"FAILED":0,"SKIPPED":29,"PROCESSING":0,"DATE":"6/27/2024","DAYHOUR":23},{"TARGETSYSTEM":"CPW","ARUNAME":"CPW_00410","TOTAL":0,"PROCESSED":0,"REMAINING":0,"ERROR":0,"FAILED":0,"SKIPPED":1,"PROCESSING":0,"DATE":"6/27/2024","DAYHOUR":23},{"TARGETSYSTEM":"CPW","ARUNAME":"CPW_ARO_H","TOTAL":0,"PROCESSED":139,"REMAINING":0,"ERROR":0,"FAILED":0,"SKIPPED":0,"PROCESSING":0,"DATE":"6/27/2024","DAYHOUR":23}]
Got it working; not the way I wanted, but it works:

index=1087_m365 sourcetype="o365:management:activity" authentication_service=AzureActiveDirectory "Actor{}.ID"="Azure MFA StrongAuthenticationService"
| eval Device = mvindex('ModifiedProperties{}.NewValue', 0)
| rex field=Device "\"DeviceName\": \"(?<DeviceName>[^\"]+)\""
| rex field=Device "\"PhoneAppVersion\": \"(?<PhoneAppVersion>[^\"]+)\""
| rex field=Device "\"DeviceToken\": \"(?<DeviceToken>[^\"]+)\""
| table user DeviceName PhoneAppVersion DeviceToken
Config validation failure reported in peer=usxzvrspidx1.usaccess.gsa.gov guid=62899FCC-C4E8-4A86-903D-C72234AE7F38. In index '_audit': Failed to create directory '/opt/splunk/var/lib/splunk/cold/audit/colddb' (File exists).

I made a change to my indexes:

[wineventlog]
homePath = volume:hotwarm/wineventlog/db
coldPath = volume:cold/wineventlog/colddb
thawedPath = $SPLUNK_DB/wineventlog/thaweddb
maxDataSize = auto_high_volume
coldPath.maxDataSizeMB = 0
maxWarmDBCount = 300
frozenTimePeriodInSecs = 33696000
repFactor = auto

[syslog]
homePath = volume:hotwarm/syslog/db
coldPath = volume:cold/syslog/colddb
thawedPath = $SPLUNK_DB/syslog/thaweddb
repFactor = auto
maxDataSize = auto_high_volume
coldPath.maxDataSizeMB = 11059200
maxWarmDBCount = 4294967295
frozenTimePeriodInSecs = 33696000

Since this change, the indexers have quit receiving data from their forwarders. I want to put the values back, but I get the error above when I try to apply the bundle change. I need help on how to fix this.
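Not a fix for the directory error itself, but a hedged sketch of one way to check each peer's bundle state from the cluster manager before re-applying the reverted configuration; it uses the standard cluster manager REST peers endpoint, and the exact field names can vary by Splunk version:

| rest /services/cluster/master/peers
| table label status active_bundle_id latest_bundle_id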
Recently, I installed a new Splunk Enterprise 9.2.1 instance (on-prem) on a RHEL 8 server and installed Universal Forwarders on a bunch of Linux (RHEL and Ubuntu) and Windows clients; their logs are being ingested fine. However, after a few days I installed the Universal Forwarder on a few more Linux machines (following the same process as before). The installation was successful, but the logs are not showing up on the indexers. I have checked and compared inputs.conf, outputs.conf, and server.conf under $SPLUNK_HOME/etc/system/local (like the other hosts) and they look good. I have run tcpdump on the clients and the indexers, and the client is sending logs to the indexer. I searched for the new hosts in $SPLUNK_HOME/var/log/splunk and they do show up in metrics.log, but when I search for index="my-index-name", I only see logs from the hosts that I installed last week; nothing from the new UFs I installed/configured yesterday. What's the best way to troubleshoot further? *This is an air-gapped environment, so I can't provide any logs.
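A hedged first troubleshooting step: if the new forwarders are actually connected, their own internal splunkd logs should be searchable on the indexers, which separates a forwarding/connectivity problem from an inputs or index-routing problem. The host value below is a placeholder for one of the new machines:

index=_internal host=<new-uf-hostname> sourcetype=splunkd
| stats count by source, component

If events come back here but nothing appears in index="my-index-name", the forwarding path is fine and the problem is more likely the monitor stanzas or the index name configured on the new hosts.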
Hi, We are currently using the Azure extension to monitor metrics for Azure Application Gateway (AAG), specifically the Total Time metric. However, we have observed that the data displayed on our dashboard is not continuous, and it appears that some data points are missing. Could you please review the attached dashboard and provide your comments and guidance on this issue? Thank you for your assistance. Regards, Jeff 
Below is my raw text in Splunk, and I want to extract the JSON array from it. After extracting it, I want to group by ARUNAME and calculate the sum of the SKIPPED and PROCESSED values.

{"id":"0","severity":"Information","message":"[{\"TARGETSYSTEM\":\"CPW\",\"ARUNAME\":\"CPW_ARO_P\",\"TOTAL\":0,\"PROCESSED\":0,\"REMAINING\":0,\"ERROR\":0,\"FAILED\":0,\"SKIPPED\":36,\"PROCESSING\":0,\"DATE\":\"6/27/2024\",\"DAYHOUR\":23},{\"TARGETSYSTEM\":\"CPW\",\"ARUNAME\":\"CPW_ARO_P\",\"TOTAL\":0,\"PROCESSED\":8,\"REMAINING\":0,\"ERROR\":0,\"FAILED\":0,\"SKIPPED\":0,\"PROCESSING\":0,\"DATE\":\"6/27/2024\",\"DAYHOUR\":23},{\"TARGETSYSTEM\":\"CPW\",\"ARUNAME\":\"CPW_ARO_P\",\"TOTAL\":0,\"PROCESSED\":0,\"REMAINING\":0,\"ERROR\":1,\"FAILED\":0,\"SKIPPED\":0,\"PROCESSING\":0,\"DATE\":\"6/27/2024\",\"DAYHOUR\":23},{\"TARGETSYSTEM\":\"CPW\",\"ARUNAME\":\"CPW_00410\",\"TOTAL\":0,\"PROCESSED\":4,\"REMAINING\":0,\"ERROR\":0,\"FAILED\":0,\"SKIPPED\":0,\"PROCESSING\":0,\"DATE\":\"6/27/2024\",\"DAYHOUR\":23},{\"TARGETSYSTEM\":\"CPW\",\"ARUNAME\":\"CPW_ARO_H\",\"TOTAL\":0,\"PROCESSED\":0,\"REMAINING\":0,\"ERROR\":0,\"FAILED\":0,\"SKIPPED\":29,\"PROCESSING\":0,\"DATE\":\"6/27/2024\",\"DAYHOUR\":23},{\"TARGETSYSTEM\":\"CPW\",\"ARUNAME\":\"CPW_00410\",\"TOTAL\":0,\"PROCESSED\":0,\"REMAINING\":0,\"ERROR\":0,\"FAILED\":0,\"SKIPPED\":1,\"PROCESSING\":0,\"DATE\":\"6/27/2024\",\"DAYHOUR\":23},{\"TARGETSYSTEM\":\"CPW\",\"ARUNAME\":\"CPW_ARO_H\",\"TOTAL\":0,\"PROCESSED\":139,\"REMAINING\":0,\"ERROR\":0,\"FAILED\":0,\"SKIPPED\":0,\"PROCESSING\":0,\"DATE\":\"6/27/2024\",\"DAYHOUR\":23}]"}
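A possible sketch, assuming the raw event is the JSON shown above with the array stored as an escaped string in the message field; the index and sourcetype filters are placeholders for whatever matches this data:

index=your_index sourcetype=your_sourcetype
| spath path=message
| spath input=message path={} output=items
| mvexpand items
| spath input=items
| stats sum(SKIPPED) as total_skipped sum(PROCESSED) as total_processed by ARUNAME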
Thank you !  
{"CreationTime": "2024-06-27T16:33:32", "Id": "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx", "Operation": "Update user.", "OrganizationId": "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx", "RecordType": 8, "ResultS... See more...
{"CreationTime": "2024-06-27T16:33:32", "Id": "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx", "Operation": "Update user.", "OrganizationId": "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx", "RecordType": 8, "ResultStatus": "Success", "UserKey": "Not Available", "UserType": 4, "Version": 1, "Workload": "AzureActiveDirectory", "ObjectId": "xxxxxxxxxxxxcom", "UserId": "ServicePrincipal_fxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx", "AzureActiveDirectoryEventType": 1, "ExtendedProperties": [{"Name": "additionalDetails", "Value": "{\"UserType\":\"Member\"}"}, {"Name": "extendedAuditEventCategory", "Value": "User"}], "ModifiedProperties": [{"Name": "StrongAuthenticationPhoneAppDetail", "NewValue": "[\r\n {\r\n \"DeviceName\": \"SM-A205U\",\r\n \"DeviceToken\": \"xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx\",\r\n \"DeviceTag\": \"Android\",\r\n \"PhoneAppVersion\": \"6.2404.2444\",\r\n \"OathTokenTimeDrift\": 0,\r\n \"DeviceId\": \"xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx\",\r\n \"Id\": \"xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx\",\r\n \"TimeInterval\": 0,\r\n \"AuthenticationType\": 3,\r\n \"NotificationType\": 4,\r\n \"LastAuthenticatedTimestamp\": \"2024-05-16T15:01:08.3691641Z\",\r\n \"AuthenticatorFlavor\": \"Authenticator\",\r\n \"HashFunction\": null,\r\n \"TenantDeviceId\": null,\r\n \"SecuredPartitionId\": 0,\r\n \"SecuredKeyId\": 0\r\n },\r\n {\r\n \"DeviceName\": \"SM-A205U\",\r\n \"DeviceToken\": \"xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxYJxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx\",\r\n \"DeviceTag\": \"SoftwareTokenActivated\",\r\n \"PhoneAppVersion\": \"6.2404.2444\",\r\n \"OathTokenTimeDrift\": 0,\r\n \"DeviceId\": \"00000000-0000-0000-0000-000000000000\",\r\n \"Id\": \"xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx\",\r\n \"TimeInterval\": 0,\r\n \"AuthenticationType\": 3,\r\n \"NotificationType\": 4,\r\n \"LastAuthenticatedTimestamp\": \"2024-05-14T16:08:39.6982523Z\",\r\n \"AuthenticatorFlavor\": \"Authenticator\",\r\n \"HashFunction\": null,\r\n \"TenantDeviceId\": null,\r\n \"SecuredPartitionId\": 0,\r\n \"SecuredKeyId\": 0\r\n },\r\n {\r\n \"DeviceName\": \"SM-S921U\",\r\n \"DeviceToken\": \"xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx\",\r\n \"DeviceTag\": \"Android\",\r\n \"PhoneAppVersion\": \"6.2406.4052\",\r\n \"OathTokenTimeDrift\": 0,\r\n \"DeviceId\": \"00000000-0000-0000-0000-000000000000\",\r\n \"Id\": \"xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx\",\r\n \"TimeInterval\": 0,\r\n \"AuthenticationType\": 3,\r\n \"NotificationType\": 4,\r\n \"LastAuthenticatedTimestamp\": \"2024-06-25T16:23:06.2912051Z\",\r\n \"AuthenticatorFlavor\": \"Authenticator\",\r\n \"HashFunction\": \"hmacsha256\",\r\n \"TenantDeviceId\": null,\r\n \"SecuredPartitionId\": 0,\r\n \"SecuredKeyId\": 0\r\n },\r\n {\r\n \"DeviceName\": \"SM-A205U\",\r\n \"DeviceToken\": \"xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx\",\r\n \"DeviceTag\": \"Android\",\r\n \"PhoneAppVersion\": \"6.2404.2444\",\r\n \"OathTokenTimeDrift\": 0,\r\n \"DeviceId\": \"xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx\",\r\n \"Id\": \"xxxxxxxxxxxxxxxxxxxxxxxxxxx\",\r\n \"TimeInterval\": 0,\r\n \"AuthenticationType\": 3,\r\n \"NotificationType\": 4,\r\n \"LastAuthenticatedTimestamp\": \"2024-05-16T15:01:08.3691641Z\",\r\n \"AuthenticatorFlavor\": \"Authenticator\",\r\n \"HashFunction\": null,\r\n \"TenantDeviceId\": null,\r\n \"SecuredPartitionId\": 0,\r\n \"SecuredKeyId\": 0\r\n },\r\n {\r\n \"DeviceName\": \"SM-A205U\",\r\n \"DeviceToken\": \"xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx\",\r\n \"DeviceTag\": 
\"SoftwareTokenActivated\",\r\n \"PhoneAppVersion\": \"6.2404.2444\",\r\n \"OathTokenTimeDrift\": 0,\r\n \"DeviceId\": \"00000000-0000-0000-0000-000000000000\",\r\n \"Id\": \"xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx\",\r\n \"TimeInterval\": 0,\r\n \"AuthenticationType\": 3,\r\n \"NotificationType\": 4,\r\n \"LastAuthenticatedTimestamp\": \"2024-05-14T16:08:39.6982523Z\",\r\n \"AuthenticatorFlavor\": \"Authenticator\",\r\n \"HashFunction\": null,\r\n \"TenantDeviceId\": null,\r\n \"SecuredPartitionId\": 0,\r\n \"SecuredKeyId\": 0\r\n }\r\n]"}, {"Name": "Included Updated Properties", "NewValue": "StrongAuthenticationPhoneAppDetail", "OldValue": ""}, {"Name": "TargetId.UserType", "NewValue": "Member", "OldValue": ""}], "Actor": [{"ID": "Azure MFA StrongAuthenticationService", "Type": 1}, {"ID": "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx", "Type": 2}, {"ID": "ServicePrincipalxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx", "Type": 2}, {"ID": "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx", "Type": 2}, {"ID": "ServicePrincipal", "Type": 2}], "ActorContextId": "xxxxxxxxxxxxxxxxxxxxxxxxxxx", "InterSystemsId": "xxxxxxxxxxxxxxxxxxxxxxxxxx", "IntraSystemId": "xxxxxxxxxxxxxxxxxxxxxxxxxx", "SupportTicketId": "", "Target": [{"ID": "Userxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx", "Type": 2}, {"ID": "xxxxxxxxxxxxxxxxxxxxxxx", "Type": 2}, {"ID": "User", "Type": 2}, {"ID": "xxxxxxxxxxxxxxxxxxxxxxxxxxxxx", "Type": 5}, {"ID": "xxxxxxxxxxxxxxxxxxxxxxxx", "Type": 3}], "TargetContextId": "4xxxxxxxxxxxxxxxxxxxxxxa48xxxxxxx46"}
Hello @yuanliu , your suggestion was exactly what I needed. Thanks to your initial query, I was able to achieve the desired outcome with some adjustments. Your detailed explanation was greatly appreciated.
Thank you @richgalloway for your insightful article that provided me with a good starting point.
Assuming events are already sorted in time order, try something like this

| streamstats window=1 current=f values(eventOrder) as previous by formDataId
| where previous > eventOrder
Please share your full event in raw format, anonymised appropriately.
Hi All, We have an application that gets events in from an external party, but occasionally we see out-of-sequence events due to underlying issues with the MQ interface [guaranteed delivery, but not necessarily in the correct order]. Identifying out-of-sequence events would then point to an issue with the underlying MQ. Given this set of data...

| makeresults format=csv data="timelogged, formDataId, eventOrder
00:02,AA,2
00:03,AA,3
00:04,AA,3
00:05,AA,4
00:06,AA,5
00:07,AA,9
01:02,BB,2
01:03,BB,3
01:04,BB,3
01:05,BB,4
01:07,BB,9
01:08,BB,5
02:02,CC,2
02:03,CC,3
02:04,CC,3
02:05,CC,4
02:06,CC,5
02:07,CC,9
03:01,DD,1
04:02,EE,2
04:03,EE,4
04:04,EE,3
04:05,EE,9"
| table timelogged, formDataId, eventOrder

...how could the highlighted transactions be identified?

Note: We do not get all types of events, and the 'first' event is not usually seen [as it indicates an error on the vendor side].
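A runnable sketch combining a trimmed subset of this sample data with the streamstats approach suggested earlier in the thread; tonumber is added so the comparison is numeric rather than lexical:

| makeresults format=csv data="timelogged, formDataId, eventOrder
00:05,AA,4
00:06,AA,5
00:07,AA,9
01:05,BB,4
01:07,BB,9
01:08,BB,5
04:03,EE,4
04:04,EE,3"
| eval eventOrder = tonumber(eventOrder)
| streamstats window=1 current=f values(eventOrder) as previous by formDataId
| where previous > eventOrder
| table timelogged, formDataId, eventOrder, previous

With this subset, the rows flagged are BB at 01:08 (5 arriving after 9) and EE at 04:04 (3 arriving after 4).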
I am trying to get DeviceName and DeviceToken into fields from a 365 log. First I use eval Device = mvindex('ModifiedProperties{}.NewValue', 0), which returns another MV with the data I want, but I can't seem to get to the field. Below is what Device shows in the editor. Any help? I want something like eval DeviceName = ModifiedProperties{}.NewValue{0}.DeviceName, but nothing I try is right. I tried to save it as a string and extract from that, but even that I can't figure out. It's the MV inside an MV that I think is throwing me.  [ { "DeviceName": "iPhone 13 mini", "DeviceToken": "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx", "DeviceTag": "SoftwareTokenActivated", "PhoneAppVersion": "6.8.11", "OathTokenTimeDrift": 0, "DeviceId": "00000000-0000-0000-0000-000000000000", "Id": "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx", "TimeInterval": 0, "AuthenticationType": 3, "NotificationType": 2, "LastAuthenticatedTimestamp": "2024-06-27T15:00:42.8784693Z", "AuthenticatorFlavor": null, "HashFunction": null, "TenantDeviceId": null, "SecuredPartitionId": 0, "SecuredKeyId": 0 } ]
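A hedged sketch of one way to get at the nested values, assuming Device holds the JSON array shown above; spath parses the string, and the array elements are expanded before extracting the individual fields:

| eval Device = mvindex('ModifiedProperties{}.NewValue', 0)
| spath input=Device path={} output=entries
| mvexpand entries
| spath input=entries
| table DeviceName DeviceToken PhoneAppVersion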
Was there ever a solution for this? I am having the same error, except with Apache.
Try something like this

index=main ExportConfigInfo "MessageJob started" OR "MessageJob completed"
| eval start=if(searchmatch("MessageJob started"),_time,null())
| eval end=if(searchmatch("MessageJob completed"),_time,null())
| bin _time span=1d
| stats min(start) as start, max(end) as end by _time
| eval diff=end-start
| eval difference=tostring(diff, "duration")
I have a search that returns two results per day (a job's log entry of when it started and when it ended). I want to be able to see the time difference between the two entries, grouped by day. I'm a newbie to Splunk advanced searching so hopefully you can help. My query is: index=main ExportConfigInfo AND ("Message=Job started" OR "Message=Job completed")  
Is it that you want to replace the spaces with new lines? Something like this perhaps, with a literal newline as the replacement string?

| eval hosts=replace(hosts," ","
")
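An alternative sketch, assuming the goal is to show one host per line in a table cell: turn the field into a multivalue field instead of embedding literal newlines, since multivalue fields render one value per row in tables. The sample hosts value is made up for illustration:

| makeresults
| eval hosts="hostA hostB hostC"
| makemv delim=" " hosts
| table hosts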
First, you want to familiarize yourself with the where command and how it differs from the search command. As @ITWhisperer said, search operates on the _raw field. Because inputlookup does not produce raw events, you need to specify which field or fields from data_source.csv to apply that regex to. Suppose all you want to do is match a field named somefield; your search can simply be:

| inputlookup data_source.csv
| where (isnull(count) AND isnull(percent)) OR match(somefield, "[^0-9a-zA-Z\-\._,]")

Here, there is no need to fillnull because the isnull function tests the condition without a spurious assignment. Now, if you want to apply that regex to every field from this lookup, the following should work, but that's really not what Splunk is designed to do.

| inputlookup data_source.csv
| foreach * [eval allfields = if(isnull(allfields), "", allfields) . <<FIELD>>]
| where (isnull(count) AND isnull(percent)) OR match(allfields, "[^0-9a-zA-Z\-\._,]")