All Posts

I'm ingesting data into Splunk via the HTTP Event Collector (HEC), but the data is wrapped inside a "data" key instead of "event". Splunk expects events inside the "event" key, and I'm getting the error:

Failed to send data: {"text":"No data","code":5}

Here’s an example of the data I’m sending:

{
  "data": {
    "timestamp": "2025-04-01T19:51:07.720Z",
    "userId": "",
    "userAgent": "Visual Studio Code/1.98.2 (Continue/1.0.5)",
    "selectedProfileId": "local",
    "eventName": "chatFeedback",
    "schema": "0.2.0",
    "prompt": "|>\n",
    "completion": "Sample completion text",
    "modelTitle": "Llama",
    "feedback": true,
    "sessionId": "c36c18eb-25e6-4448-b9b5-a50cdd2a0baa"
  }
  index="test" sourcetype="test:json" source="telemetry"
}

How can I transform incoming HEC data so that "data" is treated as "event" in Splunk? Is there a better way to handle this at the Splunk configuration level? Thanks in advance for any help! @ITWhisperer
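A hedged sketch of what a working request could look like, assuming the sender can be changed: wrap the inner object in an "event" key and move index/sourcetype/source up to HEC's top-level metadata fields (the host and token below are placeholders):

curl -k https://<splunk-host>:8088/services/collector/event \
  -H "Authorization: Splunk <hec-token>" \
  -d '{
        "index": "test",
        "sourcetype": "test:json",
        "source": "telemetry",
        "event": {
          "timestamp": "2025-04-01T19:51:07.720Z",
          "eventName": "chatFeedback",
          "feedback": true
        }
      }'

If the sender cannot be changed, the raw endpoint (/services/collector/raw) accepts arbitrary payloads (note it requires a channel ID) and leaves the parsing to props/transforms on the Splunk side.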
You also need to replace some macros which are defined in the CMC apps, or try to change their permissions to global so they can run in other apps. Either way, you must check and update them every time SCP is updated to make sure they still work. It's probably easiest to define what you want to see/show in another app and create those queries and panels based on the CMC queries, without the macros and other CMC-only knowledge objects.
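If you try the global-permissions route, a minimal sketch of what that could look like in the CMC app's metadata/local.meta (the macro name here is a placeholder; each CMC macro you depend on would need a similar stanza):

[macros/cmc_example_macro]
export = system
access = read : [ * ]

This is the same thing as setting "All apps" permissions on the macro in the UI, and it will likewise be at risk every time SCP updates the app.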
One additional comment: you don't need to extract the new version into a temp folder. You can/should extract it directly into the current forwarder folder. If I recall correctly, there is no need to update AIX's subsystem with the new binary? It should work with the current settings and start the UF after restart. If not, then check the correct steps in the installation / admin manual.
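A hedged sketch of that in-place upgrade, assuming a tarball install under /opt and Splunk's usual AIX package naming (adjust paths and the file name to your environment):

/opt/splunkforwarder/bin/splunk stop
cd /opt
gunzip -c /tmp/splunkforwarder-<version>-AIX-powerpc.tgz | tar -xvf -
/opt/splunkforwarder/bin/splunk start --accept-license --answer-yes

Extracting over the existing directory replaces the binaries while leaving your etc/ configuration in place.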
I prefer to use this command in PowerShell:

msiexec.exe /i <path to temp>/splunkforwarder-<XXXXX>-x64-release.msi AGREETOLICENSE=yes LAUNCHSPLUNK=no SERVICESTARTTYPE=auto /quiet /l*v install-log.txt

The /l*v install-log.txt gives you a more verbose log file with messages which help you solve issues. Then I have a separate app which contains the DS definitions and another for the outputs.conf definition. Both of those are also on the DS and can be updated later with the DS. After I have added those two apps, I start the SplunkForwarder service.
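A hedged sketch of what those two apps could contain (the app names, hosts, and ports are placeholders, not anything Splunk mandates):

# etc/apps/org_all_deploymentclient/local/deploymentclient.conf
[deployment-client]

[target-broker:deploymentServer]
targetUri = ds.example.com:8089

# etc/apps/org_all_forwarder_outputs/local/outputs.conf
[tcpout]
defaultGroup = primary_indexers

[tcpout:primary_indexers]
server = idx1.example.com:9997, idx2.example.com:9997

With those in place, starting the service is just: Start-Service SplunkForwarder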
I agree with @yuanliu that you should tell us more to get answers. Am I right if I assume that this lookup "12k_line.csv" contains more than 10k lines?
Have you tried different push modes? See https://docs.splunk.com/Documentation/Splunk/9.2.1/DistSearch/PropagateSHCconfigurationchanges#Choose_a_deployer_push_mode You could set this separately for each app in its app.conf file, or globally in .../etc/system/local/app.conf
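A hedged example of the per-app variant, assuming you want merge_to_default for a given app (the mode value is one of the documented options; pick whichever fits your case):

# app.conf inside the app, on the deployer
[shclustering]
deployer_push_mode = merge_to_default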
Hi, I think that you should create e.g. a CSV lookup file which contains something like:

job, relative day, start time, end time
job1, 0, 22:00, 00:00
job2, 1, 03:00, 05:00
job3, 1, 06:00, 08:00

Maybe some other fields if/as needed. Then use those values together with _time from the indexed data of the ran job's log. Also use relative_time to adjust/check times in the past and future. r. Ismo
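A minimal SPL sketch of the comparison, assuming the lookup is saved as job_schedule.csv with headers job, relative_day, and start_time, and that the job logs sit in a hypothetical index=jobs (all of these names are assumptions):

index=jobs sourcetype=job:log
| lookup job_schedule.csv job OUTPUT relative_day start_time
| eval start_sec = tonumber(mvindex(split(start_time,":"),0))*3600 + tonumber(mvindex(split(start_time,":"),1))*60
| eval expected_start = relative_time(_time, "@d") + relative_day*86400 + start_sec
| eval status = if(_time <= expected_start, "on time", "late")

relative_time(_time, "@d") snaps to midnight of the event's day, so adding relative_day*86400 plus the scheduled start in seconds gives the expected start timestamp to compare against.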
Hi, I don't think that you can do this with core Splunk. But if you are using classic dashboards, then maybe this is doable with JavaScript? And there is also the React UI (see splunkui.splunk.com) which might help you with it? Maybe someone else who is more familiar with those could help you? Also, this app may help you: https://splunkbase.splunk.com/app/6859 r. Ismo
You should try to find other events which contain this bid in your _internal log. Those will probably give you some hints about what causes this error message.
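A hedged example of such a search, reusing the bid from the error message (the sourcetype filter is an assumption; widen the search if nothing comes back):

index=_internal sourcetype=splunkd "wineventlog~157~96ECF7C4-1951-4288-B90A-9133E5408F14"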
Your Sample logs.txt didn't work with your query. I suppose that it's not the real sample data that you have gotten from your index=.... query! Also, there is one lookup whose content is unknown.
For point 2, yes. If from the UI, like the alert configuration screen, mention the ServiceNow table name in the 'endpoint'. If from a custom search, along with the minimum parameters Account and Correlation_ID, add 'scripted_endpoint', e.g.

| eval scripted_endpoint="/api/now/table/xxxxxx"

Refer to the Splunk documentation: Commands, alert actions, and scripts - Splunk Add-on for ServiceNow
Hi @livehybrid On Splunkbase, the latest version 2.0.2 has a March 18, 2024 date, which was over a year ago. So, looking to see what other avenues can be taken, I tried contacting the developer email listed, gsa-request@splunk.com, but have not had any response back.
Hi @lbaity As the latest version (2.0.2) was only uploaded a couple of weeks ago, there is a reasonable chance it is still going through the internal App Review vetting for Cloud apps, which can take up to 4 weeks. The older version (2.0.1) has likely lost its Splunk Cloud compatibility approval due to the age of the version. If it hasn't been approved in a couple of weeks, then I would suggest chasing internally with the AppInspect team.

Did this answer help you? If so, please consider:
- Adding kudos to show it was useful
- Marking it as the solution if it resolved your issue
- Commenting if you need any clarification
Your feedback encourages the volunteers in this community to continue contributing.
Would it be possible to get the Slack add-on for Splunk (4986) ver 2.0.2 submitted to be cloud vetted and made compatible for cloud customers? If so, I would like to make the request. The current compatibility for the app as listed on Splunkbase only lists Splunk Enterprise. As such, it does not show up as a viable app to be installed for cloud customers, and Splunk support cannot install the app as it is not validated to work on Splunk Cloud stacks. https://splunkbase.splunk.com/app/4986
Thank you @yuanliu It worked
I'm getting thousands of log events that say:

ERROR CMSlave [2549383 CMNotifyThread] - Cannot find bid=wineventlog~157~96ECF7C4-1951-4288-B90A-9133E5408F14. cleaning up usage data

It is on all my indexers and references multiple, but not all, indexes. Any ideas on how to fix that error?
Hi @Real_captain There isn't a built-in Splunk Web feature for this rotation. I would recommend using a browser extension such as "Tab Rotate" - these often have configuration options like the amount of time and rate at which different tabs are rotated/reloaded. Alternatively, if you are using Classic dashboards, you could write some custom JavaScript, but I think you'll probably have better success, quicker, with an off-the-shelf browser extension.

Did this answer help you? If so, please consider:
- Adding kudos to show it was useful
- Marking it as the solution if it resolved your issue
- Commenting if you need any clarification
Your feedback encourages the volunteers in this community to continue contributing.
Hi @siv, Yes, you can add a field to an existing KV Store collection without directly editing collections.conf by using the Splunk REST API. See https://dev.splunk.com/enterprise/docs/developapps/manageknowledge/kvstore/usetherestapitomanagekv/#:~:text=Define%20the%20collection%20schema%3A for more info, but I will cover it briefly below.

You can use curl to add a new field named new_field_name of type string to a collection named my_collection within the my_app app context. You'll need to update the existing field definitions and include the new one in the payload.

First, get the current definition (optional but helpful):

curl -k -u admin:yourpassword \
  https://<serverName>:8089/servicesNS/nobody/my_app/storage/collections/config/my_collection

Then, POST the updated configuration, including all existing fields plus the new one:

curl -k -u admin:yourpassword \
  -X POST \
  https://<serverName>:8089/servicesNS/nobody/my_app/storage/collections/config/my_collection \
  -d 'field.existing_field1=string' \
  -d 'field.existing_field2=number' \
  -d 'field.new_field_name=string' # Add your new field here

This method requires appropriate permissions, specifically the capability to POST updates to KV Store configurations via the REST API. Using the REST API effectively updates the configuration as if you had edited the collections.conf file, but does so remotely.

Documentation:
KV Store REST API endpoints: https://docs.splunk.com/Documentation/Splunk/latest/RESTREF/RESTkvstore
Specifically, the collection endpoint: https://docs.splunk.com/Documentation/Splunk/latest/RESTREF/RESTkvstore#storage.2Fcollections.2Fconfig.2F.7Bcollection.7D

Did this answer help you? If so, please consider:
- Adding kudos to show it was useful
- Marking it as the solution if it resolved your issue
- Commenting if you need any clarification
Your feedback encourages the volunteers in this community to continue contributing.
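One hedged follow-up, in case it is useful: once the schema accepts the new field, records can be written through the data endpoint in the same way (collection, app, and values are the same placeholders as above):

curl -k -u admin:yourpassword \
  -X POST -H 'Content-Type: application/json' \
  https://<serverName>:8089/servicesNS/nobody/my_app/storage/collections/data/my_collection \
  -d '{"existing_field1": "some value", "existing_field2": 42, "new_field_name": "hello"}'

Existing records won't gain the new field until something writes it; KV Store does not backfill.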
Hi @siv, I don't think so; you could try to upload a file with the additional column, but I'm not sure. Ciao. Giuseppe
@kaushik3g @iamarkaprabha please help me on the similar issue- https://community.splunk.com/t5/Getting-Data-In/Akamai-add-on-logs-are-not-populating/m-p/743241#M118086