I am trying to ingest some JSON data into a new Splunk Cloud instance with a custom sourcetype, but I keep getting duplicate data in the search results. This seems to be an extremely common problem, judging by the number of old posts, but none of them seem to address the Cloud version.

I have a JSON file that looks like this:

{
    "RowNumber": 1,
    "ApplicationName": "177525278",
    "ClientProcessID": 114889,
    "DatabaseName": "1539703986",
    "StartTime": "2024-07-30 12:15:13"
}

I have a Windows 2022 server with a 9.2.2 universal forwarder installed. I manually added a very simple app to the C:\Program Files\SplunkUniversalForwarder\etc\apps folder.

inputs.conf contains this batch input stanza:

[batch://C:\splunk_test_files\ErrorMaster\*.json]
move_policy = sinkhole
index = centraladmin_errormaster
sourcetype = errormaster

props.conf contains this sourcetype (copied from _json):

[errormaster]
pulldown_type = true
INDEXED_EXTRACTIONS = json
KV_MODE = none
category = Structured
description = JavaScript Object Notation format. For more information, visit http://json.org/

On the cloud side I created (from the UI) a new sourcetype called 'errormaster' as a direct clone of the existing _json type.

When I add a .json file to the folder, it is ingested and the events show up in the cloud instance, under the correct centraladmin_errormaster index and with sourcetype=errormaster. However, the fields all have duplicate values. If I switch it to the built-in _json type it works fine. I have some field extractions I want to add, which is why I wanted a custom type.

I'm guessing this is something obvious to the Cloud experts, but I am an accidental Splunk Admin with very little experience, so any help you can offer would be appreciated.
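To illustrate what I mean by "duplicate values" (this is just a rough example query, not the exact search I ran): with a search along these lines, every extracted field on an event comes back as a multivalue pair of identical values, e.g. ApplicationName shows 177525278 twice on the same event.

index=centraladmin_errormaster sourcetype=errormaster
| head 1
| table RowNumber ApplicationName ClientProcessID DatabaseName StartTime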