All Posts



Hi @tawm_12, The simplest method is often configuring your applications within the containers to log to stdout/stderr and then using the Docker Splunk logging driver to forward these logs directly to your Splunk Cloud HEC endpoint. If your applications must log to files within the container filesystem, you can use a Universal Forwarder (UF) sidecar container.

Method 1: Docker Logging Driver (recommended if apps log to stdout/stderr)

Configure your application inside the Docker container to write its logs to standard output (stdout) and standard error (stderr); this is common practice for containerized applications. Then configure the Docker daemon or individual containers to use the splunk logging driver, pointing it at your Splunk Cloud HEC endpoint and token. Example docker run command (angle-bracket values are placeholders):

  docker run \
    --log-driver=splunk \
    --log-opt splunk-token=<hec-token> \
    --log-opt splunk-url=https://<splunk-cloud-host>:8088 \
    --log-opt splunk-format=json \
    --log-opt splunk-verify-connection=false \
    # Add other options like splunk-sourcetype, splunk-index, tag, etc.
    your-application-image

This method leverages Docker's built-in logging capabilities: the driver captures the container's stdout/stderr streams (which contain your application logs if configured correctly) and forwards them via HEC.

Method 2: Universal Forwarder Sidecar (if apps log to files)

Deploy a Splunk Universal Forwarder container alongside your application container. Mount the volume containing the application log files into both the application container (for writing) and the UF container (for reading). Configure the UF container's inputs.conf to monitor the log files within the mounted volume, and its outputs.conf to forward data to your Splunk Cloud HEC endpoint or an intermediate Heavy Forwarder. Using HEC output from the UF is generally preferred for Splunk Cloud.

Example UF inputs.conf:

  [monitor:///path/to/mounted/logs/app.log]
  sourcetype = your_app_sourcetype
  index = your_app_index
  disabled = false

Example UF outputs.conf (for HEC; angle-bracket values are placeholders):

  [httpout]
  uri = https://<splunk-cloud-host>:8088
  hecToken = <hec-token>
  # Consider sslVerifyServerCert = true in production after cert setup
  sslVerifyServerCert = false
  useACK = true

  [tcpout:splunk_cloud_forwarder]
  server = <heavy-forwarder-host>:<port>
  # Use if forwarding via UF->HF->Splunk Cloud S2S
  # Other S2S settings...
  # disabled = true  # Disable if using httpout

The UF actively monitors the specified log files and forwards new events; this is suitable when applications cannot log to stdout/stderr. The UF sidecar runs in parallel with your app container, sharing the log volume.

To address your two concerns directly: the Docker logging driver *does* send application logs, provided the application logs are directed to the container's stdout/stderr. The approach involving a separate Splunk Enterprise container solely for forwarding is overly complex and not typically recommended; a UF can forward directly or via a standard Heavy Forwarder infrastructure. If you are running in Kubernetes, consider Splunk Connect for Kubernetes, which streamlines log collection using the OpenTelemetry Collector. For sending data to Splunk Cloud via HEC, see Splunk Lantern: Getting Data In - Best Practices for Getting Data into Splunk.

Did this answer help you? If so, please consider:
- Adding kudos to show it was useful
- Marking it as the solution if it resolved your issue
- Commenting if you need any clarification
Your feedback encourages the volunteers in this community to continue contributing
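To make the UF sidecar pattern described above concrete, here is a minimal docker-compose sketch. The image names, paths, volume names, and environment variables are illustrative placeholders, not a tested configuration; the splunk/universalforwarder image settings in particular should be checked against its documentation:

```yaml
version: "3.8"
services:
  app:
    image: your-application-image        # hypothetical application image
    volumes:
      - applogs:/var/log/app             # the app writes its log files here

  uf-sidecar:
    image: splunk/universalforwarder:latest
    environment:
      SPLUNK_START_ARGS: --accept-license
      SPLUNK_PASSWORD: changeme          # placeholder; use a secret in practice
    volumes:
      - applogs:/var/log/app:ro          # same volume, mounted read-only
      # mount your inputs.conf/outputs.conf into the UF's app directory:
      - ./uf-config:/opt/splunkforwarder/etc/apps/my_inputs/local

volumes:
  applogs:
```

The key point is the shared named volume: the application writes log files into it, and the UF reads them and forwards per its inputs.conf/outputs.conf.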
Hi @viren1990, This does sound like an odd situation; as you say, if one of the endpoints works then I would expect the others to work as well. Would you be able to share some of the Python code you are using for the connection? The other thing that comes to mind is whether there is a firewall / proxy server between your server and your outbound connection to the internet. If so, there is a chance that it is letting the first request through but blocking the others.

Did this answer help you? If so, please consider:
- Adding kudos to show it was useful
- Marking it as the solution if it resolved your issue
- Commenting if you need any clarification
Your feedback encourages the volunteers in this community to continue contributing
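One way to rule the SDK itself out is to hit the REST endpoints with a plain HTTP client and a short timeout; which exception comes back helps distinguish "blocked by a firewall/proxy" from "reachable but slow". Below is a minimal stdlib-only sketch — the function name and return shape are my own invention, and the URL/token in the comment are placeholders:

```python
import socket
import urllib.error
import urllib.request

def probe_endpoint(base_url, endpoint, token, timeout=15):
    """GET a Splunk REST endpoint; return (True, status) on success,
    or (False, reason) on HTTP error, timeout, or connection failure."""
    req = urllib.request.Request(
        base_url.rstrip("/") + endpoint + "?output_mode=json",
        headers={"Authorization": "Bearer " + token},
    )
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return True, resp.status
    except urllib.error.HTTPError as e:
        return False, "http error %d" % e.code      # server reachable, request rejected
    except socket.timeout:
        return False, "timeout"                     # connected, but no (full) response
    except urllib.error.URLError as e:
        return False, "unreachable: %s" % e.reason  # DNS/connect-level failure

# e.g. probe_endpoint("https://<your-stack>.splunkcloud.com:8089",
#                     "/services/server/info", "<token>")
```

If /services/apps/local succeeds but /services/server/info times out from the same host with this probe, that points at path-based filtering somewhere between the client and Splunk rather than at the SDK.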
If all you want is to remove those extra messageIDs, you can simply remove those with null request_time, like
| search request_time = *
In addition to everybody's speculations, the biggest problem in the SPL in my opinion is that the whole search will only return one field, User; the entire exercise/homework is simply to restrict which User values are allowed. No inner join or stats is needed for this task because a plain old subsearch is designed for this. There are a million ways to do this. Given that the original SPL lavishes dedup on rest command outputs, I assume that 12k_line.csv is the largest dataset, so I am using that as the lead search. (Any command can be used as the lead search; the corresponding subsearches just need to be adjusted.)

| inputlookup 12k_line.csv where
    [rest /services/authentication/users splunk_server=local
    | search type=SAML
    | fields title
    | rename title AS User]
    [rest /servicesNS/-/-/directory
    | fields author
    | dedup author
    | rename author AS User]
| fields User
If I read your context correctly, you want to use the values of "name" in parameters as keys, and those of "value" as values, producing something like the following based on your sample data:

storedProcedureName                   DocumentFileTypeId  DocumentId  DocumentTypeId  DocumentVersionId  IncludeInactive  RETURN_VALUE
DocumentFileTypeGetById               7                                                                                   0
DocumentAttributeGetByDocumentTypeId                                  00                                 false            0
DocumentDetailGetByParentId                               000000                                                          0
DocumentStatusHistoryGetByFK                                                          000000                              0
DocumentVersionGetByFK                                    000000                                                          0
DocumentLinkGetByFK                                       000000                                                          0
DocumentGetById                                           000000                                                          0
DocumentFileTypeGetById               7                                                                                   0
DocumentStatusHistoryGetByFK                                                          000000                              0
DocumentVersionGetByFK                                    000000                                                          0
DocumentLinkGetByFK                                       000000                                                          0
DocumentGetById                                           000000                                                          0

Here I preserved storedProcedureName as a reference. Also note that when you sanitize sample data, any fake value with multiple zeros (0s) must be quoted in order to be valid JSON. To return the above, use the JSON functions introduced in 8.1:

| eval kvparams = json_object()
| foreach parameters mode=json_array
    [eval kvparams = json_set(kvparams, json_extract(<<ITEM>>, "name"), json_extract(<<ITEM>>, "value"))]
| spath input=kvparams
| rename @* as *

Here is a full emulation using the 12 events (with corrected JSON syntax) for you to play with and compare with real data.
| makeresults format=json data="[ {\"auditResultSets\":null,\"schema\":\"ref\",\"storedProcedureName\":\"DocumentFileTypeGetById\",\"commandText\":\"ref.DocumentFileTypeGetById\",\"Locking\":null,\"commandType\":4,\"parameters\":[{\"name\":\"@RETURN_VALUE\",\"value\":0},{\"name\":\"@DocumentFileTypeId\",\"value\":7}],\"serverIPAddress\":\"000.000.000.000\",\"serverHost\":\"Webserver\",\"clientIPAddress\":\"000.000.000.000\",\"sourceSystem\":\"WebSite\",\"module\":\"Vendor.Product.BLL.DocumentManagement\",\"accessDate\":\"2025-03-21T16:37:14.8614186-06:00\",\"userId\":\"0000\",\"userName\":\"username\",\"traceInformation\":[{\"type\":\"Page\",\"class\":\"Vendor.Product.Web.UI.Website.DocumentManagement.ViewDocument\",\"method\":\"Page_Load\"},{\"type\":\"Manager\",\"class\":\"Vendor.Product.BLL.DocumentManagement.DocumentManager\",\"method\":\"Get\"}]}, {\"auditResultSets\":null,\"schema\":\"ref\",\"storedProcedureName\":\"DocumentAttributeGetByDocumentTypeId\",\"commandText\":\"ref.DocumentAttributeGetByDocumentTypeId\",\"Locking\":null,\"commandType\":4,\"parameters\":[{\"name\":\"@RETURN_VALUE\",\"value\":0},{\"name\":\"@DocumentTypeId\",\"value\":\"00\"},{\"name\":\"@IncludeInactive\",\"value\":false}],\"serverIPAddress\":\"000.000.000.000\",\"serverHost\":\"Webserver\",\"clientIPAddress\":\"000.000.000.000\",\"sourceSystem\":\"WebSite\",\"module\":\"Vendor.Product.BLL.DocumentManagement\",\"accessDate\":\"2025-03-21T16:37:14.8614186-06:00\",\"userId\":\"0000\",\"userName\":\"username\",\"traceInformation\":[{\"type\":\"Page\",\"class\":\"Vendor.Product.Web.UI.Website.DocumentManagement.ViewDocument\",\"method\":\"Page_Load\"},{\"type\":\"Manager\",\"class\":\"Vendor.Product.BLL.DocumentManagement.DocumentManager\",\"method\":\"Get\"}]}, 
{\"auditResultSets\":null,\"schema\":\"ref\",\"storedProcedureName\":\"DocumentDetailGetByParentId\",\"commandText\":\"ref.DocumentDetailGetByParentId\",\"Locking\":null,\"commandType\":4,\"parameters\":[{\"name\":\"@RETURN_VALUE\",\"value\":0},{\"name\":\"@DocumentId\",\"value\":\"000000\"}],\"serverIPAddress\":\"000.000.000.000\",\"serverHost\":\"Webserver\",\"clientIPAddress\":\"000.000.000.000\",\"sourceSystem\":\"WebSite\",\"module\":\"Vendor.Product.BLL.DocumentManagement\",\"accessDate\":\"2025-03-21T16:37:14.8614186-06:00\",\"userId\":\"0000\",\"userName\":\"username\",\"traceInformation\":[{\"type\":\"Page\",\"class\":\"Vendor.Product.Web.UI.Website.DocumentManagement.ViewDocument\",\"method\":\"Page_Load\"},{\"type\":\"Manager\",\"class\":\"Vendor.Product.BLL.DocumentManagement.DocumentManager\",\"method\":\"Get\"}]}, {\"auditResultSets\":null,\"schema\":\"ref\",\"storedProcedureName\":\"DocumentStatusHistoryGetByFK\",\"commandText\":\"ref.DocumentStatusHistoryGetByFK\",\"Locking\":null,\"commandType\":4,\"parameters\":[{\"name\":\"@RETURN_VALUE\",\"value\":0},{\"name\":\"@DocumentVersionId\",\"value\":\"000000\"},{\"name\":\"@IncludeInactive\",\"value\":\"\"}],\"serverIPAddress\":\"000.000.000.000\",\"serverHost\":\"Webserver\",\"clientIPAddress\":\"000.000.000.000\",\"sourceSystem\":\"WebSite\",\"module\":\"Vendor.Product.BLL.DocumentManagement\",\"accessDate\":\"2025-03-21T16:37:14.8614186-06:00\",\"userId\":\"0000\",\"userName\":\"username\",\"traceInformation\":[{\"type\":\"Page\",\"class\":\"Vendor.Product.Web.UI.Website.DocumentManagement.ViewDocument\",\"method\":\"Page_Load\"},{\"type\":\"Manager\",\"class\":\"Vendor.Product.BLL.DocumentManagement.DocumentManager\",\"method\":\"Get\"}]}, 
{\"auditResultSets\":null,\"schema\":\"ref\",\"storedProcedureName\":\"DocumentVersionGetByFK\",\"commandText\":\"ref.DocumentVersionGetByFK\",\"Locking\":null,\"commandType\":4,\"parameters\":[{\"name\":\"@RETURN_VALUE\",\"value\":0},{\"name\":\"@DocumentId\",\"value\":\"000000\"}],\"serverIPAddress\":\"000.000.000.000\",\"serverHost\":\"Webserver\",\"clientIPAddress\":\"000.000.000.000\",\"sourceSystem\":\"WebSite\",\"module\":\"Vendor.Product.BLL.DocumentManagement\",\"accessDate\":\"2025-03-21T16:37:14.8614186-06:00\",\"userId\":\"0000\",\"userName\":\"username\",\"traceInformation\":[{\"type\":\"Page\",\"class\":\"Vendor.Product.Web.UI.Website.DocumentManagement.ViewDocument\",\"method\":\"Page_Load\"},{\"type\":\"Manager\",\"class\":\"Vendor.Product.BLL.DocumentManagement.DocumentManager\",\"method\":\"Get\"}]}, {\"auditResultSets\":null,\"schema\":\"ref\",\"storedProcedureName\":\"DocumentLinkGetByFK\",\"commandText\":\"ref.DocumentLinkGetByFK\",\"Locking\":null,\"commandType\":4,\"parameters\":[{\"name\":\"@RETURN_VALUE\",\"value\":0},{\"name\":\"@DocumentId\",\"value\":\"000000\"}],\"serverIPAddress\":\"000.000.000.000\",\"serverHost\":\"Webserver\",\"clientIPAddress\":\"000.000.000.000\",\"sourceSystem\":\"WebSite\",\"module\":\"Vendor.Product.BLL.DocumentManagement\",\"accessDate\":\"2025-03-21T16:37:14.8614186-06:00\",\"userId\":\"0000\",\"userName\":\"username\",\"traceInformation\":[{\"type\":\"Page\",\"class\":\"Vendor.Product.Web.UI.Website.DocumentManagement.ViewDocument\",\"method\":\"Page_Load\"},{\"type\":\"Manager\",\"class\":\"Vendor.Product.BLL.DocumentManagement.DocumentManager\",\"method\":\"Get\"}]}, 
{\"auditResultSets\":null,\"schema\":\"ref\",\"storedProcedureName\":\"DocumentGetById\",\"commandText\":\"ref.DocumentGetById\",\"Locking\":null,\"commandType\":4,\"parameters\":[{\"name\":\"@RETURN_VALUE\",\"value\":0},{\"name\":\"@DocumentId\",\"value\":\"000000\"}],\"serverIPAddress\":\"000.000.000.000\",\"serverHost\":\"Webserver\",\"clientIPAddress\":\"000.000.000.000\",\"sourceSystem\":\"WebSite\",\"module\":\"Vendor.Product.BLL.DocumentManagement\",\"accessDate\":\"2025-03-21T16:37:14.8457543-06:00\",\"userId\":\"0000\",\"userName\":\"username\",\"traceInformation\":[{\"type\":\"Page\",\"class\":\"Vendor.Product.Web.UI.Website.DocumentManagement.ViewDocument\",\"method\":\"Page_Load\"},{\"type\":\"Manager\",\"class\":\"Vendor.Product.BLL.DocumentManagement.DocumentManager\",\"method\":\"Get\"}]}, {\"auditResultSets\":null,\"schema\":\"ref\",\"storedProcedureName\":\"DocumentFileTypeGetById\",\"commandText\":\"ref.DocumentFileTypeGetById\",\"Locking\":null,\"commandType\":4,\"parameters\":[{\"name\":\"@RETURN_VALUE\",\"value\":0},{\"name\":\"@DocumentFileTypeId\",\"value\":7}],\"serverIPAddress\":\"000.000.000.000\",\"serverHost\":\"Webserver\",\"clientIPAddress\":\"000.000.000.000\",\"sourceSystem\":\"WebSite\",\"module\":\"Vendor.Product.BLL.DocumentManagement\",\"accessDate\":\"2025-03-21T16:37:14.736377-06:00\",\"userId\":\"0000\",\"userName\":\"username\",\"traceInformation\":[{\"type\":\"Page\",\"class\":\"Vendor.Product.Web.UI.Website.DocumentManagement.DocumentManagementMain\",\"method\":\"ViewDocument\"},{\"type\":\"Manager\",\"class\":\"Vendor.Product.BLL.DocumentManagement.DocumentManager\",\"method\":\"GetLatestDocumentwithoutAttributes\"}]}, 
{\"auditResultSets\":null,\"schema\":\"ref\",\"storedProcedureName\":\"DocumentStatusHistoryGetByFK\",\"commandText\":\"ref.DocumentStatusHistoryGetByFK\",\"Locking\":null,\"commandType\":4,\"parameters\":[{\"name\":\"@RETURN_VALUE\",\"value\":0},{\"name\":\"@DocumentVersionId\",\"value\":\"000000\"},{\"name\":\"@IncludeInactive\",\"value\":\"\"}],\"serverIPAddress\":\"000.000.000.000\",\"serverHost\":\"Webserver\",\"clientIPAddress\":\"000.000.000.000\",\"sourceSystem\":\"WebSite\",\"module\":\"Vendor.Product.BLL.DocumentManagement\",\"accessDate\":\"2025-03-21T16:37:14.736377-06:00\",\"userId\":\"0000\",\"userName\":\"username\",\"traceInformation\":[{\"type\":\"Page\",\"class\":\"Vendor.Product.Web.UI.Website.DocumentManagement.DocumentManagementMain\",\"method\":\"ViewDocument\"},{\"type\":\"Manager\",\"class\":\"Vendor.Product.BLL.DocumentManagement.DocumentManager\",\"method\":\"GetLatestDocumentwithoutAttributes\"}]}, {\"auditResultSets\":null,\"schema\":\"ref\",\"storedProcedureName\":\"DocumentVersionGetByFK\",\"commandText\":\"ref.DocumentVersionGetByFK\",\"Locking\":null,\"commandType\":4,\"parameters\":[{\"name\":\"@RETURN_VALUE\",\"value\":0},{\"name\":\"@DocumentId\",\"value\":\"000000\"}],\"serverIPAddress\":\"000.000.000.000\",\"serverHost\":\"Webserver\",\"clientIPAddress\":\"000.000.000.000\",\"sourceSystem\":\"WebSite\",\"module\":\"Vendor.Product.BLL.DocumentManagement\",\"accessDate\":\"2025-03-21T16:37:14.736377-06:00\",\"userId\":\"0000\",\"userName\":\"username\",\"traceInformation\":[{\"type\":\"Page\",\"class\":\"Vendor.Product.Web.UI.Website.DocumentManagement.DocumentManagementMain\",\"method\":\"ViewDocument\"},{\"type\":\"Manager\",\"class\":\"Vendor.Product.BLL.DocumentManagement.DocumentManager\",\"method\":\"GetLatestDocumentwithoutAttributes\"}]}, 
{\"auditResultSets\":null,\"schema\":\"ref\",\"storedProcedureName\":\"DocumentLinkGetByFK\",\"commandText\":\"ref.DocumentLinkGetByFK\",\"Locking\":null,\"commandType\":4,\"parameters\":[{\"name\":\"@RETURN_VALUE\",\"value\":0},{\"name\":\"@DocumentId\",\"value\":\"000000\"}],\"serverIPAddress\":\"000.000.000.000\",\"serverHost\":\"Webserver\",\"clientIPAddress\":\"000.000.000.000\",\"sourceSystem\":\"WebSite\",\"module\":\"Vendor.Product.BLL.DocumentManagement\",\"accessDate\":\"2025-03-21T16:37:14.736377-06:00\",\"userId\":\"0000\",\"userName\":\"username\",\"traceInformation\":[{\"type\":\"Page\",\"class\":\"Vendor.Product.Web.UI.Website.DocumentManagement.DocumentManagementMain\",\"method\":\"ViewDocument\"},{\"type\":\"Manager\",\"class\":\"Vendor.Product.BLL.DocumentManagement.DocumentManager\",\"method\":\"GetLatestDocumentwithoutAttributes\"}]}, {\"auditResultSets\":null,\"schema\":\"ref\",\"storedProcedureName\":\"DocumentGetById\",\"commandText\":\"ref.DocumentGetById\",\"Locking\":null,\"commandType\":4,\"parameters\":[{\"name\":\"@RETURN_VALUE\",\"value\":0},{\"name\":\"@DocumentId\",\"value\":\"000000\"}],\"serverIPAddress\":\"000.000.000.000\",\"serverHost\":\"Webserver\",\"clientIPAddress\":\"000.000.000.000\",\"sourceSystem\":\"WebSite\",\"module\":\"Vendor.Product.BLL.DocumentManagement\",\"accessDate\":\"2025-03-21T16:37:14.736377-06:00\",\"userId\":\"0000\",\"userName\":\"username\",\"traceInformation\":[{\"type\":\"Page\",\"class\":\"Vendor.Product.Web.UI.Website.DocumentManagement.DocumentManagementMain\",\"method\":\"ViewDocument\"},{\"type\":\"Manager\",\"class\":\"Vendor.Product.BLL.DocumentManagement.DocumentManager\",\"method\":\"GetLatestDocumentwithoutAttributes\"}]} ]" | fields parameters storedProcedureName | eval kvparams = json_object() | foreach parameters mode=json_array [eval kvparams = json_set(kvparams, json_extract(<<ITEM>>, "name"), json_extract(<<ITEM>>, "value"))] | spath input=kvparams | rename @* as * | fields - _* 
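For readers who want to check the logic outside Splunk, the same name-to-key flattening can be sketched in plain Python (a hypothetical standalone illustration, not part of the SPL above; the leading "@" is dropped to mirror the `rename @* as *` step):

```python
import json

def flatten_parameters(event):
    """Turn the parameters array of {"name": ..., "value": ...} objects into a
    flat dict keyed by name (leading '@' dropped, mirroring `rename @* as *`)."""
    return {p["name"].lstrip("@"): p["value"] for p in event.get("parameters", [])}

event = json.loads('{"storedProcedureName": "DocumentFileTypeGetById",'
                   ' "parameters": [{"name": "@RETURN_VALUE", "value": 0},'
                   ' {"name": "@DocumentFileTypeId", "value": 7}]}')
print(flatten_parameters(event))  # {'RETURN_VALUE': 0, 'DocumentFileTypeId': 7}
```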
@tawm_12 We recently completed an integration for one of our customers using the following links:

https://stackoverflow.com/questions/53287922/how-to-forward-application-logs-to-splunk-from-docker-container
https://www.splunk.com/en_us/blog/tips-and-tricks/splunk-logging-driver-for-docker.html
@4SplunkUser You need to install the Splunk Add-on for vCenter Logs (specifically the Splunk_TA_vcenter package) on your search head if you want the search-time field extractions to work correctly. This ensures that when you search the vCenter log data in Splunk, the fields (e.g., event types, timestamps, etc.) are properly parsed and displayed.

You can also install the add-on on a Heavy Forwarder (HF), and in some cases that makes a lot of sense depending on your Splunk architecture. The add-on has both index-time (e.g., line breaking, timestamp recognition) and search-time (e.g., field extractions) components. Installing it on the HF ensures index-time processing happens there, which can reduce load on the indexers. However, you'll still need it on the search head for search-time fields.

I can see that the add-on is capable of parsing data for the following sourcetypes:
vmware:vclog:vpxd
vmware:vclog:vpxd-alert
vmware:vclog:vpxd-profiler
vmware:vclog:vws
vmware:vclog:cim-diag
vmware:vclog:stats

To ingest vCenter logs into Splunk:
1. Configure ESXi/vCenter to send logs to a syslog receiver (UF/HF).
2. Use the Splunk Add-on on that receiver to parse those logs.
3. Ensure the add-on is also installed on the HF/search head as appropriate for your environment.

NOTE: Ensure that your logs align with the expected sourcetypes defined in the props.conf and transforms.conf configurations.
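As a sketch of the syslog-receiving step, a network input in inputs.conf on the receiver might look like the following. The port, index, and sourcetype here are illustrative assumptions on my part; check the add-on's props.conf for the sourcetypes it actually expects in your version:

```ini
# inputs.conf on the UF/HF receiving vCenter syslog (illustrative values)
[udp://:514]
sourcetype = vmware:vclog:vpxd
index = vmware
disabled = false
```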
Splunk Add-on for vCenter Logs does not have anything under the installation tab. Do we just need to install it on the search head for the vCenter logs to be interpreted correctly, or is it something that can be used to get the logs into Splunk via API calls? Better documentation would be great as it is a Splunk-supported app.
I have two searches and I only want to find rows which have a common messageID. Currently it is returning extra rows because of the second search. The query before the OR returns 100 records, and with the OR it returns 110 rows; for those extra 10 rows the messageID in the first search is null, so I want to drop those messages. Please help me change this query to make it work. I am trying to find the count of matched IDs and the list of all such IDs.

```query for apigateway call```
(index=aws_np earliest="03/28/2025:13:30:00" latest="03/28/2025:14:35:00" "Method response body after transformations:" sourcetype="aws:apigateway" business_unit=XX aws_account_alias="XXXX" network_environment=xxXXX source="API-Gateway-Execution-Logs*" (application="xXXXXX" OR application="xXXXX-xXX")
| rex field=_raw "Method response body after transformations: (?<json>[^$]+)"
| spath input=json path="header.messageID" output=messageID
| spath input=json path="payload.statusType.code" output=status
| spath input=json path="payload.statusType.text" output=text
| spath input=json path="header.action" output=action
| where status=200 and action="Create"
| rename _time as request_time
| table messageID, request_time)
| append
```query for 2nd query call```
    [ search kubernetes_cluster="eks-XXX*" index="aws_XXX" sourcetype="kubernetes_logs" source=*XXXX* "sendData"
    | rex field=_raw "sendData: (?<json>[^$]+)"
    | spath input=json path="header.messageID" output=messageID
    | rename _time as pubsub_time
    | table messageID, pubsub_time ]
| stats values(request_time) as request_time values(pubsub_time) as pubsub_time by messageID
Hi everyone, I'm seeking advice on the best way to send application logs from our client's Docker containers into a Splunk Cloud instance, and I’d appreciate your input and experiences. Currently, my leading approach involves using Docker’s "Splunk logging driver" to forward data via the HEC. However, my understanding is that this method primarily sends container-level data rather than detailed application logs. Another method I came across involves deploying Splunk's Docker image to create a standalone Enterprise container alongside the Universal Forwarder. The idea here is to set up monitors in the forwarder's inputs.conf to send data to the Enterprise instance and then route it via a Heavy Forwarder to Splunk Cloud. Has anyone successfully implemented either of these approaches—or perhaps a different method—to ingest application logs from Docker containers into Splunk Cloud? Any insights, tips, or shared experiences would be greatly appreciated. Thanks in advance for your help! Cheers,
Like the others, I'm not totally sure what you're after, but given your example data, this SPL

| stats latest(version) as last_version max(_time) as last_time count(version) as count_version dc(version) as dc_version by hostname model system

will tell you the count of version records, the last version/date, and the distinct version count for each host/system/model. However, if you want to detect "newer" versus going backwards in versions, you'll need to define that rule in the version data. Also, this will only tell you what happened within the search range; how far back do you want to go?
It is not clear what you are expecting your result to be. You mention devices but these are not mentioned in your data. Since each event appears to represent a different version, can you not just count the events? Please clarify what you are trying to do, what more of your data looks like and what your expected results would be.
Would you like to explain what you mean?
If I have understood right, Django is no longer supported on recent Splunk versions: https://docs.splunk.com/Documentation/Splunk/9.4.0/Installation/ChangesforSplunkappdevelopers ?
Unnecessary comment. I expect better from a Trust member.
Hi, It's obvious that without replication=true it's only on the SH side and the indexers cannot use it. Can you tell us more about that later error report? r. Ismo
Hello Team - I have a strange use case wherein, while invoking Splunk Cloud REST APIs via the Python SDK, only one endpoint, /services/apps/local, returns a 200 response; for any other endpoint, such as /services/server/info or /services/search/jobs, I get a connection timeout.

While debugging I looked at Splunk's internal logs (using index=_internal) and found that for the request made through the client there is an entry in the access logs with a 200/201 HTTP code, but I am not sure why it would result in a connection timeout [Err 110], as if the client kept waiting to receive the response from the server and eventually gave up. I tried increasing the timeout value on the client side as well, with no luck.

I don't think reachability is the issue here, as the /services/apps/local endpoint on port 8089 is accessible, and for the other endpoints too there are log traces on the Splunk Cloud side, as mentioned above. So what could the issue be? The search query is also extremely simple:

search index=_internal | stats count by sourcetype

Please help.
In a cluster you should also change the internal indexes to auto! Otherwise Splunk doesn't replicate those buckets!
I think this is a duplicate question? But here is one old post which could help you: https://community.splunk.com/t5/Installation/EC2-from-AMI-having-splunk-installed-stops-working/m-p/669633#M13418
Probably the best way to do this is to create a lookup file / KV store collection where you store the current/earlier versions. Then just create an SPL query which generates the current status and checks the differences between the two. If you need exact SPL, then please share what you currently have, along with some sample data with masked identifiers.
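As a language-neutral sketch of the comparison step (hypothetical host/version field names; in Splunk this would be the lookup plus an SPL comparison rather than Python):

```python
def version_changes(previous, current):
    """Compare two {host: version} snapshots; return {host: (old, new)} for
    hosts whose version changed, plus newly seen (old=None) and vanished
    (new=None) hosts."""
    changes = {h: (previous.get(h), v)
               for h, v in current.items() if previous.get(h) != v}
    changes.update({h: (previous[h], None)
                    for h in previous if h not in current})
    return changes

prev = {"sw01": "1.2", "sw02": "1.3"}
curr = {"sw01": "1.4", "sw03": "2.0"}
print(version_changes(prev, curr))
# {'sw01': ('1.2', '1.4'), 'sw03': (None, '2.0'), 'sw02': ('1.3', None)}
```

The stored snapshot plays the role of the lookup; after each run you would write `curr` back so the next comparison starts from it.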