All Posts

@isoutamo I also presumed the 12k_line.csv would have > 10,000 events (probably 12,000!). I don't think this should be an issue here, though, as append supports 50,000 events by default. @hank72 Please let us know if you have any trouble with the provided search or if I've got the wrong end of your requirements.

Did this answer help you? If so, please consider:
- Adding kudos to show it was useful
- Marking it as the solution if it resolved your issue
- Commenting if you need any clarification

Your feedback encourages the volunteers in this community to continue contributing.
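For reference, if a subsearch ever does need more than the default, the limit can be raised per invocation of append. A minimal sketch, where the base search and lookup name are assumptions rather than the original query:

index=main sourcetype=transactions
| append maxout=100000 [| inputlookup 12k_line.csv]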
Hi @isoutamo

I'm a little confused here, as I was under the impression UFs were pretty stateless. They don't have Python or KVStore, and they do not locally index data, unlike HFs or other full Splunk Enterprise instances, which definitely need to be updated to specific versions incrementally. I've updated countless UFs from 7 -> 9 without issue, but I'm happy to update my previous post if needed.

Looking at the remote UF updater (https://docs.splunk.com/Documentation/Forwarder/1.0.0/ForwarderRemoteUpgradeLinux/Supporteduniversalforwarderversions), it supports a minimum version of 8.0.0 and upgrades directly to 9.x, so I am content that this is feasible. I know that for non-UF hosts there is a pretty strict upgrade path.
You could run /opt/splunk/bin/splunk package app <app name> on the SH's command line to merge and export the app.
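For example, a minimal sketch, where "my_app" is a placeholder app name:

# Run as the splunk user on the search head
/opt/splunk/bin/splunk package app my_app
# The command should print the path of the generated .spl package,
# which you can then copy off the host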
Whoops! Sorry, I read that too quickly as 2025. I see v2.0.2 previously had Cloud compatibility, so I presume it was dropped because the app was not updated to new versions of the Splunk SDK etc. Hopefully they will get back to you and someone will update it soon.
Actually, you cannot update it directly from old to new unless it matches the restrictions which are defined for Splunk servers too! Usually this means that you can jump over one version, like 7.3.x -> 8.1.x -> 9.0.x -> 9.2.x -> 9.4.x. Also, you must start the UF at each step so that e.g. the fishbucket DB and other things which have changed between versions get their internal updates.

Of course, you could remove the old UF installation and install the newest version from scratch. But then you need to remember that this means:
- You lose your UF's GUID => it will look like a new UF from the server's point of view. Of course, you can reuse the same GUID in .../etc/instance.cfg and keep the old UF's identity on the server side (see the sketch after this list).
- splunk.secret will change, which means that if you have any secrets/passwords in your old configurations and you try to use them, you need to supply the plaintext versions and let the UF encrypt them again.
- You lose the record of where your inputs are, as you lose the fishbucket DB which keeps track of them => you will reingest all files again on this node.
- Maybe something else which I forgot?

I know that updating from some version to another could work without issues, but not for all. And those issues could arise later on, not immediately after you start the new version. I also strongly recommend using the OS's native software packages instead of the tar versions. That way it's much easier to manage your OS-level information, as you can trust your package management software's records.
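To illustrate the GUID point above, a minimal sketch of $SPLUNK_HOME/etc/instance.cfg; the GUID value is a made-up placeholder:

# Copy the guid line from the old installation before removing it,
# then restore it (while the UF is stopped) in the fresh install
[general]
guid = 12345678-ABCD-ABCD-ABCD-123456789012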
I'm ingesting data into Splunk via the HTTP Event Collector (HEC), but the data is wrapped inside a "data" key instead of "event". Splunk expects events inside the "event" key, and I'm getting the error:

Failed to send data: {"text":"No data","code":5}

Here's an example of the data I'm sending:

{
  "data": {
    "timestamp": "2025-04-01T19:51:07.720Z",
    "userId": "",
    "userAgent": "Visual Studio Code/1.98.2 (Continue/1.0.5)",
    "selectedProfileId": "local",
    "eventName": "chatFeedback",
    "schema": "0.2.0",
    "prompt": "|>\n",
    "completion": "Sample completion text",
    "modelTitle": "Llama",
    "feedback": true,
    "sessionId": "c36c18eb-25e6-4448-b9b5-a50cdd2a0baa"
  },
  "index": "test",
  "sourcetype": "test:json",
  "source": "telemetry"
}

How can I transform incoming HEC data so that "data" is treated as "event" in Splunk? Is there a better way to handle this at the Splunk configuration level? Thanks in advance for any help! @ITWhisperer
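For context, a minimal sketch of the envelope that HEC's /services/collector endpoint accepts (the host name and token are placeholders, and the event body is abbreviated); renaming the "data" key to "event" on the sending side would match it:

curl -k https://splunk.example.com:8088/services/collector \
  -H "Authorization: Splunk <hec-token>" \
  -d '{"event": {"eventName": "chatFeedback", "feedback": true}, "index": "test", "sourcetype": "test:json", "source": "telemetry"}'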
You also need to replace some macros which are defined in the CMC apps, or try to change their permissions to global so they can be run from other apps. In any case, you must check and update them every time SCP is updated, to ensure that they still work. It's probably easiest to define what you want to see/show in another app and create those queries and panels based on the CMC's queries, without macros and other CMC-only KOs.
One additional comment: you don't need to extract the new version into a temp folder. You can/should extract it directly over the current forwarder folder. If I recall correctly, there is no need to update AIX's subsystem with the new binary; it should work with the current settings and start the UF after a restart. If not, then check the correct steps in the installation / admin manual.
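As a sketch of that in-place upgrade, assuming a tarball install under /opt/splunkforwarder; the filename is a placeholder for whatever package you downloaded:

# Stop the forwarder, overlay the new files, then start it again
# (assumes GNU tar; adjust for the AIX native tar if needed)
/opt/splunkforwarder/bin/splunk stop
tar -xzvf splunkforwarder-<version>-AIX-powerpc.tgz -C /opt
/opt/splunkforwarder/bin/splunk start --accept-license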
I prefer to use this command in PowerShell:

msiexec.exe /i <path to temp>/splunkforwarder-<XXXXX>-x64-release.msi AGREETOLICENSE=yes LAUNCHSPLUNK=no SERVICESTARTTYPE=auto /quiet /l*v install-log.txt

The /l*v install-log.txt option gives you a more verbose log file, which helps you solve issues. Then I have a separate app which contains the DS definition, and another for the outputs.conf definition. Both of those are also on the DS and can be updated later with the DS. After I have added those two apps, I start the SplunkForwarder service.
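To illustrate those two apps, a minimal sketch; the app names, DS host, and indexer hosts are placeholders, not values from this post:

# etc/apps/org_all_deploymentclient/local/deploymentclient.conf
[deployment-client]

[target-broker:deploymentServer]
targetUri = ds.example.com:8089

# etc/apps/org_all_forwarder_outputs/local/outputs.conf
[tcpout]
defaultGroup = primary_indexers

[tcpout:primary_indexers]
server = idx1.example.com:9997, idx2.example.com:9997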
I agree with @yuanliu that you should tell us more to get answers. Am I right if I assume that this lookup "12k_line.csv" contains more than 10k lines?
Have you tried different push modes? See https://docs.splunk.com/Documentation/Splunk/9.2.1/DistSearch/PropagateSHCconfigurationchanges#Choose_a_deployer_push_mode You can set this separately for each app in its app.conf file, or globally in .../etc/system/local/app.conf.
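For reference, a minimal sketch of the per-app form; merge_to_default is just one of the documented modes (full, merge_to_default, local_only, default_only), not a recommendation:

# <app>/local/app.conf on the deployer
[shclustering]
deployer_push_mode = merge_to_default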
Hi

I think that you should create e.g. a CSV lookup file which contains something like:

job, relative_day, start_time, end_time
job1, 0, 22:00, 00:00
job2, 1, 03:00, 05:00
job3, 1, 06:00, 08:00

Maybe some other fields if/as needed. Then use those values together with _time from the indexed data of the job's run log. Also use relative_time to adjust/check times in the past and future (a sketch follows below).

r. Ismo
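A minimal sketch of combining those fields with _time; the index, sourcetype, and lookup file name are assumptions:

index=job_logs sourcetype=scheduler
| lookup job_schedule.csv job OUTPUT relative_day start_time end_time
| eval window_start = relative_time(_time, "+".relative_day."d@d")
    + tonumber(mvindex(split(start_time, ":"), 0)) * 3600
    + tonumber(mvindex(split(start_time, ":"), 1)) * 60
| eval started_late = if(_time > window_start, "yes", "no")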
Hi

I don't think that you can do this with core Splunk. But if you are using classic dashboards, then maybe this is doable with JavaScript? There is also the React UI (see splunkui.splunk.com), which might help you with it. Maybe someone else who is more familiar with those could help you? This app may also help: https://splunkbase.splunk.com/app/6859

r. Ismo
You should try to find other events which contain this bid in your _internal logs. Those will probably give you some hints about what is causing this error message.
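A minimal sketch of such a search; <bid> stands for the identifier taken from the original error message:

index=_internal "<bid>"
| stats count by component, sourcetype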
Your Sample logs.txt didn't work with your query. I suppose that it's not real sample data that you have gotten from your index=.... query! Also, there is one lookup whose content is unknown.
For point 2, yes. If from the UI, like the alert configuration screen, mention the ServiceNow table name in the 'endpoint'. If from a custom search, along with the minimum parameters Account and Correlation_ID, add 'scripted_endpoint', e.g.

| eval scripted_endpoint="/api/now/table/xxxxxx"

Refer to the Splunk documentation: Commands, alert actions, and scripts - Splunk Add-on for ServiceNow
Hi @livehybrid

On Splunkbase, the latest version 2.0.2 is dated March 18, 2024, which was over a year ago. So I'm looking to see what other avenues can be taken. I tried contacting the developer email listed, gsa-request@splunk.com, but have not had any response back.
Hi @lbaity

As the latest version (2.0.2) was only uploaded a couple of weeks ago, there is a reasonable chance it is still going through the internal App Review vetting for Cloud apps, which can take up to 4 weeks. The older version (2.0.1) has likely lost its Splunk Cloud compatibility approval due to the age of the version. If it hasn't been approved in a couple of weeks, then I would suggest chasing it internally with the AppInspect team.

Did this answer help you? If so, please consider:
- Adding kudos to show it was useful
- Marking it as the solution if it resolved your issue
- Commenting if you need any clarification

Your feedback encourages the volunteers in this community to continue contributing.
Would it be possible to get the Slack Add-on for Splunk (4986) ver 2.0.2 submitted to be cloud vetted and made compatible for cloud customers? If so, I would like to make the request. The current compatibility for the app, as listed on Splunkbase, only lists Splunk Enterprise. As such, it does not show up as a viable app to be installed for cloud customers, and Splunk support cannot install the app as it is not validated to work on Splunk Cloud stacks. https://splunkbase.splunk.com/app/4986
Thank you @yuanliu, it worked!