All Posts


The IN operator is not supported by the where command. You can use IN with the search command or the in() function with the where command. In this case, however, IN is not needed at all if the subsearch is made part of the base search (before the first pipe).

index=provisioning_index cf_org_name=abcd cf_app_name=xyz "ReconCount:" jobNumber
    [search index=provisioning_index cf_org_name=abcd cf_app_name=xyz operation="operation1" status=SUCCESS
    | search NOT jobType="Canc"
    | table jobNumber]
| stats count by deliveryInd
| addcoltotals
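For reference, the two supported forms look like this; the deliveryInd values below are made up just to show the syntax:

... | search deliveryInd IN ("Y", "N")
... | where in(deliveryInd, "Y", "N")

Note that in() with where takes a literal list of values, so a subsearch cannot be dropped into it the way it can into the base search.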
Never mind, all. By some miracle I figured it out!

| eval ProdCode = replace(ProdCode,"^(\d)\d{3}-\d(\d{3})","\1xxx-x\2")
Hello, I know it's already been 2 weeks, but I'm still waiting for an answer. Can anyone help me out?
Hello, I'm fairly new to Splunk and am trying to search using a where clause to filter the results. The query is running long, so I'm wondering if I'm not doing this right. A toned-down version of the search:

index=provisioning_index cf_org_name=abcd cf_app_name=xyz "ReconCount:"
| where jobNumber IN ([search index=provisioning_index cf_org_name=abcd cf_app_name=xyz operation="operation1" status=SUCCESS | search NOT jobType="Canc" | table jobNumber])
| stats count by deliveryInd
| addcoltotals
In Splunk, I added an AWS add-on and tried to get data from AWS S3. While creating the input, it took the sourcetype as aws:s3:csv by default, and I was receiving the data properly. However, I accidentally changed the configuration for the aws:s3:csv sourcetype, and now the logs are not being received correctly. Can anyone help me by providing the default configuration for this sourcetype?
Here is the answer: https://docs.splunk.com/Documentation/Splunk/9.2.1/Admin/Web-featuresconf#.5Bfeature:dashboards_csp.5D

In web-features.conf there is a stanza called [feature:dashboards_csp] where you can allowlist domains like this:

dashboards_trusted_domain.<name> = <string>

e.g. dashboards_trusted_domain.smartsheet = app.smartsheet.com
SE ver 9.1.2, upgrading from ES 7.2 to 7.3.1.
Ran the install (expands the SPL out to the respective apps), restarted splunkd, then went into the UI and started the ES app, where it states that ES has not been fully configured and to hit the green button to do so (continue to app setup page).
When I do, I get this error popup:
ParsingError: Source contains parsing errors: '<string>' [line 2]: '[]\n'
I hit OK and it doesn't go anywhere. I can get to other apps, so Splunk Enterprise is OK... ES, not so OK. Help!
Hi.    Your reply is greatly appreciated, but I must use the eval command to achieve my results. Do you have an eval command solution?  
The current regex takes the first 4 digits and the last 4 digits and then puts them back together, which is why the result does not change. Try this, which takes the first and last 3 digits and puts them together:

| rex field=AcctCode mode=sed "s/(\d{3})\d-\d(\d{3})/\1\2/"
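If an eval-based version is preferred, the same capture groups should work with the replace() function; this is just a sketch of the equivalent, not part of the original reply:

| eval AcctCode = replace(AcctCode, "(\d{3})\d-\d(\d{3})", "\1\2")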
When using regex, how can I take a field formatted as "0012-4250" and only show the first and last 3 digits? I tried the following, which maintains the original output:

| eval AcctCode = replace(AcctCode,"(\d{4}-)(\d{4})","\1\2")
Please try this if you want to add the port.   | makeresults | eval dest="example.com", dest_port=8441 | lookup sslcert_lookup dest dest_port OUTPUT ssl_subject_common_name ssl_subject_alt_name ssl_end_time ssl_validity_window | eval ssl_subject_alt_name = split(ssl_subject_alt_name,"|") | eval days_left = round(ssl_validity_window/86400)
Here is the exact message:

message: [{"TARGETSYSTEM":"CPW","ARUNAME":"CPW_ARO_P","TOTAL":0,"PROCESSED":0,"REMAINING":0,"ERROR":0,"FAILED":0,"SKIPPED":36,"PROCESSING":0,"DATE":"6/27/2024","DAYHOUR":23},{"TARGETSYSTEM":"CPW","ARUNAME":"CPW_ARO_P","TOTAL":0,"PROCESSED":8,"REMAINING":0,"ERROR":0,"FAILED":0,"SKIPPED":0,"PROCESSING":0,"DATE":"6/27/2024","DAYHOUR":23},{"TARGETSYSTEM":"CPW","ARUNAME":"CPW_ARO_P","TOTAL":0,"PROCESSED":0,"REMAINING":0,"ERROR":1,"FAILED":0,"SKIPPED":0,"PROCESSING":0,"DATE":"6/27/2024","DAYHOUR":23},{"TARGETSYSTEM":"CPW","ARUNAME":"CPW_00410","TOTAL":0,"PROCESSED":4,"REMAINING":0,"ERROR":0,"FAILED":0,"SKIPPED":0,"PROCESSING":0,"DATE":"6/27/2024","DAYHOUR":23},{"TARGETSYSTEM":"CPW","ARUNAME":"CPW_ARO_H","TOTAL":0,"PROCESSED":0,"REMAINING":0,"ERROR":0,"FAILED":0,"SKIPPED":29,"PROCESSING":0,"DATE":"6/27/2024","DAYHOUR":23},{"TARGETSYSTEM":"CPW","ARUNAME":"CPW_00410","TOTAL":0,"PROCESSED":0,"REMAINING":0,"ERROR":0,"FAILED":0,"SKIPPED":1,"PROCESSING":0,"DATE":"6/27/2024","DAYHOUR":23},{"TARGETSYSTEM":"CPW","ARUNAME":"CPW_ARO_H","TOTAL":0,"PROCESSED":139,"REMAINING":0,"ERROR":0,"FAILED":0,"SKIPPED":0,"PROCESSING":0,"DATE":"6/27/2024","DAYHOUR":23}]
Got it working; not the way I wanted, but it works.

index=1087_m365 sourcetype="o365:management:activity" authentication_service=AzureActiveDirectory "Actor{}.ID"="Azure MFA StrongAuthenticationService"
| eval Device = mvindex('ModifiedProperties{}.NewValue', 0)
| rex field=Device "\"DeviceName\": \"(?<DeviceName>[^\"]+)\""
| rex field=Device "\"PhoneAppVersion\": \"(?<PhoneAppVersion>[^\"]+)\""
| rex field=Device "\"DeviceToken\": \"(?<DeviceToken>[^\"]+)\""
| table user DeviceName PhoneAppVersion DeviceToken
Config validation failure reported in peer=usxzvrspidx1.usaccess.gsa.gov guid=62899FCC-C4E8-4A86-903D-C72234AE7F38. In index '_audit': Failed to create directory '/opt/splunk/var/lib/splunk/cold/audit/colddb' (File exists).

I made a change to my indexes.conf:

[wineventlog]
homePath = volume:hotwarm/wineventlog/db
coldPath = volume:cold/wineventlog/colddb
thawedPath = $SPLUNK_DB/wineventlog/thaweddb
maxDataSize = auto_high_volume
coldPath.maxDataSizeMB = 0
maxWarmDBCount = 300
frozenTimePeriodInSecs = 33696000
repFactor = auto

[syslog]
homePath = volume:hotwarm/syslog/db
coldPath = volume:cold/syslog/colddb
thawedPath = $SPLUNK_DB/syslog/thaweddb
repFactor = auto
maxDataSize = auto_high_volume
coldPath.maxDataSizeMB = 11059200
maxWarmDBCount = 4294967295
frozenTimePeriodInSecs = 33696000

Since this change, the indexers quit receiving data from their forwarders. I want to put the values back, but I'm getting the error above when I try to apply the bundle change. Need help on how to fix this.
Recently, I installed a new Splunk Enterprise 9.2.1 (on-prem) on a RHEL8 server and installed Universal Forwarders on a bunch of Linux (RHEL and Ubuntu) and Windows clients, and logs are being ingested fine. However, after waiting a few days, I installed the Universal Forwarder on a few more Linux machines (following the same process as before); the installation was successful, but logs are not showing up on the indexers. I have checked and compared inputs.conf, outputs.conf, and server.conf under $SPLUNK_HOME/etc/system/local (like the other hosts) and they look good. I have run tcpdump on the clients and the indexers, and the client is sending logs to the indexer. I looked for the new hosts in $SPLUNK_HOME/var/log/splunk and they do show up in metrics.log, but when I search for index="my-index-name", I only see logs from the hosts I installed last week; nothing from the new UFs I installed/configured yesterday. What's the best way to troubleshoot further?

*This is an air-gapped environment, so I can't provide any logs.
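Since a forwarder also ships its own internal logs, would checking whether the new hosts appear in the indexers' _internal index be a sensible next step? For example (the hostname is a placeholder):

index=_internal host=<new-uf-hostname> | stats count by source sourcetype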
Hi, We are currently using the Azure extension to monitor metrics for Azure Application Gateway (AAG), specifically the Total Time metric. However, we have observed that the data displayed on our dashboard is not continuous, and it appears that some data points are missing. Could you please review the attached dashboard and provide your comments and guidance on this issue? Thank you for your assistance. Regards, Jeff 
Below is my raw text in Splunk, and I want to extract the JSON array from it. After extracting, I want to group by ARUNAME and calculate the sum of the SKIPPED and PROCESSED values.

{"id":"0","severity":"Information","message":"[{\"TARGETSYSTEM\":\"CPW\",\"ARUNAME\":\"CPW_ARO_P\",\"TOTAL\":0,\"PROCESSED\":0,\"REMAINING\":0,\"ERROR\":0,\"FAILED\":0,\"SKIPPED\":36,\"PROCESSING\":0,\"DATE\":\"6/27/2024\",\"DAYHOUR\":23},{\"TARGETSYSTEM\":\"CPW\",\"ARUNAME\":\"CPW_ARO_P\",\"TOTAL\":0,\"PROCESSED\":8,\"REMAINING\":0,\"ERROR\":0,\"FAILED\":0,\"SKIPPED\":0,\"PROCESSING\":0,\"DATE\":\"6/27/2024\",\"DAYHOUR\":23},{\"TARGETSYSTEM\":\"CPW\",\"ARUNAME\":\"CPW_ARO_P\",\"TOTAL\":0,\"PROCESSED\":0,\"REMAINING\":0,\"ERROR\":1,\"FAILED\":0,\"SKIPPED\":0,\"PROCESSING\":0,\"DATE\":\"6/27/2024\",\"DAYHOUR\":23},{\"TARGETSYSTEM\":\"CPW\",\"ARUNAME\":\"CPW_00410\",\"TOTAL\":0,\"PROCESSED\":4,\"REMAINING\":0,\"ERROR\":0,\"FAILED\":0,\"SKIPPED\":0,\"PROCESSING\":0,\"DATE\":\"6/27/2024\",\"DAYHOUR\":23},{\"TARGETSYSTEM\":\"CPW\",\"ARUNAME\":\"CPW_ARO_H\",\"TOTAL\":0,\"PROCESSED\":0,\"REMAINING\":0,\"ERROR\":0,\"FAILED\":0,\"SKIPPED\":29,\"PROCESSING\":0,\"DATE\":\"6/27/2024\",\"DAYHOUR\":23},{\"TARGETSYSTEM\":\"CPW\",\"ARUNAME\":\"CPW_00410\",\"TOTAL\":0,\"PROCESSED\":0,\"REMAINING\":0,\"ERROR\":0,\"FAILED\":0,\"SKIPPED\":1,\"PROCESSING\":0,\"DATE\":\"6/27/2024\",\"DAYHOUR\":23},{\"TARGETSYSTEM\":\"CPW\",\"ARUNAME\":\"CPW_ARO_H\",\"TOTAL\":0,\"PROCESSED\":139,\"REMAINING\":0,\"ERROR\":0,\"FAILED\":0,\"SKIPPED\":0,\"PROCESSING\":0,\"DATE\":\"6/27/2024\",\"DAYHOUR\":23}]"}
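One possible approach, sketched under the assumption that the event parses as JSON and the message field contains the embedded array as a string (field names taken from the sample above):

| spath path=message output=json_message
| spath input=json_message path={} output=item
| mvexpand item
| spath input=item
| stats sum(SKIPPED) as total_skipped sum(PROCESSED) as total_processed by ARUNAME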
Thank you !  
{"CreationTime": "2024-06-27T16:33:32", "Id": "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx", "Operation": "Update user.", "OrganizationId": "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx", "RecordType": 8, "ResultS... See more...
{"CreationTime": "2024-06-27T16:33:32", "Id": "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx", "Operation": "Update user.", "OrganizationId": "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx", "RecordType": 8, "ResultStatus": "Success", "UserKey": "Not Available", "UserType": 4, "Version": 1, "Workload": "AzureActiveDirectory", "ObjectId": "xxxxxxxxxxxxcom", "UserId": "ServicePrincipal_fxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx", "AzureActiveDirectoryEventType": 1, "ExtendedProperties": [{"Name": "additionalDetails", "Value": "{\"UserType\":\"Member\"}"}, {"Name": "extendedAuditEventCategory", "Value": "User"}], "ModifiedProperties": [{"Name": "StrongAuthenticationPhoneAppDetail", "NewValue": "[\r\n {\r\n \"DeviceName\": \"SM-A205U\",\r\n \"DeviceToken\": \"xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx\",\r\n \"DeviceTag\": \"Android\",\r\n \"PhoneAppVersion\": \"6.2404.2444\",\r\n \"OathTokenTimeDrift\": 0,\r\n \"DeviceId\": \"xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx\",\r\n \"Id\": \"xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx\",\r\n \"TimeInterval\": 0,\r\n \"AuthenticationType\": 3,\r\n \"NotificationType\": 4,\r\n \"LastAuthenticatedTimestamp\": \"2024-05-16T15:01:08.3691641Z\",\r\n \"AuthenticatorFlavor\": \"Authenticator\",\r\n \"HashFunction\": null,\r\n \"TenantDeviceId\": null,\r\n \"SecuredPartitionId\": 0,\r\n \"SecuredKeyId\": 0\r\n },\r\n {\r\n \"DeviceName\": \"SM-A205U\",\r\n \"DeviceToken\": \"xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxYJxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx\",\r\n \"DeviceTag\": \"SoftwareTokenActivated\",\r\n \"PhoneAppVersion\": \"6.2404.2444\",\r\n \"OathTokenTimeDrift\": 0,\r\n \"DeviceId\": \"00000000-0000-0000-0000-000000000000\",\r\n \"Id\": \"xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx\",\r\n \"TimeInterval\": 0,\r\n \"AuthenticationType\": 3,\r\n \"NotificationType\": 4,\r\n \"LastAuthenticatedTimestamp\": \"2024-05-14T16:08:39.6982523Z\",\r\n \"AuthenticatorFlavor\": \"Authenticator\",\r\n \"HashFunction\": null,\r\n \"TenantDeviceId\": null,\r\n \"SecuredPartitionId\": 0,\r\n \"SecuredKeyId\": 0\r\n },\r\n {\r\n \"DeviceName\": \"SM-S921U\",\r\n \"DeviceToken\": \"xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx\",\r\n \"DeviceTag\": \"Android\",\r\n \"PhoneAppVersion\": \"6.2406.4052\",\r\n \"OathTokenTimeDrift\": 0,\r\n \"DeviceId\": \"00000000-0000-0000-0000-000000000000\",\r\n \"Id\": \"xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx\",\r\n \"TimeInterval\": 0,\r\n \"AuthenticationType\": 3,\r\n \"NotificationType\": 4,\r\n \"LastAuthenticatedTimestamp\": \"2024-06-25T16:23:06.2912051Z\",\r\n \"AuthenticatorFlavor\": \"Authenticator\",\r\n \"HashFunction\": \"hmacsha256\",\r\n \"TenantDeviceId\": null,\r\n \"SecuredPartitionId\": 0,\r\n \"SecuredKeyId\": 0\r\n },\r\n {\r\n \"DeviceName\": \"SM-A205U\",\r\n \"DeviceToken\": \"xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx\",\r\n \"DeviceTag\": \"Android\",\r\n \"PhoneAppVersion\": \"6.2404.2444\",\r\n \"OathTokenTimeDrift\": 0,\r\n \"DeviceId\": \"xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx\",\r\n \"Id\": \"xxxxxxxxxxxxxxxxxxxxxxxxxxx\",\r\n \"TimeInterval\": 0,\r\n \"AuthenticationType\": 3,\r\n \"NotificationType\": 4,\r\n \"LastAuthenticatedTimestamp\": \"2024-05-16T15:01:08.3691641Z\",\r\n \"AuthenticatorFlavor\": \"Authenticator\",\r\n \"HashFunction\": null,\r\n \"TenantDeviceId\": null,\r\n \"SecuredPartitionId\": 0,\r\n \"SecuredKeyId\": 0\r\n },\r\n {\r\n \"DeviceName\": \"SM-A205U\",\r\n \"DeviceToken\": \"xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx\",\r\n \"DeviceTag\": 
\"SoftwareTokenActivated\",\r\n \"PhoneAppVersion\": \"6.2404.2444\",\r\n \"OathTokenTimeDrift\": 0,\r\n \"DeviceId\": \"00000000-0000-0000-0000-000000000000\",\r\n \"Id\": \"xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx\",\r\n \"TimeInterval\": 0,\r\n \"AuthenticationType\": 3,\r\n \"NotificationType\": 4,\r\n \"LastAuthenticatedTimestamp\": \"2024-05-14T16:08:39.6982523Z\",\r\n \"AuthenticatorFlavor\": \"Authenticator\",\r\n \"HashFunction\": null,\r\n \"TenantDeviceId\": null,\r\n \"SecuredPartitionId\": 0,\r\n \"SecuredKeyId\": 0\r\n }\r\n]"}, {"Name": "Included Updated Properties", "NewValue": "StrongAuthenticationPhoneAppDetail", "OldValue": ""}, {"Name": "TargetId.UserType", "NewValue": "Member", "OldValue": ""}], "Actor": [{"ID": "Azure MFA StrongAuthenticationService", "Type": 1}, {"ID": "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx", "Type": 2}, {"ID": "ServicePrincipalxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx", "Type": 2}, {"ID": "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx", "Type": 2}, {"ID": "ServicePrincipal", "Type": 2}], "ActorContextId": "xxxxxxxxxxxxxxxxxxxxxxxxxxx", "InterSystemsId": "xxxxxxxxxxxxxxxxxxxxxxxxxx", "IntraSystemId": "xxxxxxxxxxxxxxxxxxxxxxxxxx", "SupportTicketId": "", "Target": [{"ID": "Userxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx", "Type": 2}, {"ID": "xxxxxxxxxxxxxxxxxxxxxxx", "Type": 2}, {"ID": "User", "Type": 2}, {"ID": "xxxxxxxxxxxxxxxxxxxxxxxxxxxxx", "Type": 5}, {"ID": "xxxxxxxxxxxxxxxxxxxxxxxx", "Type": 3}], "TargetContextId": "4xxxxxxxxxxxxxxxxxxxxxxa48xxxxxxx46"}
Hello @yuanliu , your suggestion was exactly what I needed. Thanks to your initial query, I was able to achieve the desired outcome with some adjustments. Your detailed explanation was greatly appreciated.