Getting Data In: Session 2 - Wed 3/29/23

Published on 02-23-2023 12:08 PM by Community Manager | Updated on 06-26-2023 02:43 PM

Register here and ask questions below this thread for the Community Office Hours session on Getting Data In (GDI) to Splunk Platform on Wed, March 29, 2023 at 1pm PT / 4pm ET.


This is your opportunity to ask technical Splunk experts questions related to your specific GDI challenge or use case, like how to onboard common data sources (AWS, Azure, Windows, *nix, etc.), using forwarders, apps to get data in, Data Manager (Splunk Cloud Platform), ingest actions, archiving your data, and anything else you’d like to learn!


There are two 30-minute sessions in this series. You can choose to attend one or both (each session will cover a different set of questions):

Wednesday, March 15th – 1:00 pm PT / 4:00 pm ET

Wednesday, March 29th – 1:00 pm PT / 4:00 pm ET


Please submit your questions below as comments in advance. You can also head to the #office-hours user Slack channel to ask questions (request access here). Pre-submitted questions (with upvotes) will be prioritized. After that, we will go in order of the questions posted below, then will open the floor up to live Q&A with meeting participants. If there’s a quick answer available, we’ll post as a direct reply.


Look forward to connecting!

Splunk Employee

Hey Everyone!

Drop your questions/comments here for any topics you'd like to see discussed in the Community Office Hours session (you can also head to the #office-hours user Slack channel to ask questions and join the discussion - request access here).

Hi, I'm interested in building a health-check dashboard in Splunk for a Pega PRPC (java-based) application.

I'd want to ingest data via a) SQL queries and b) parsing the JSON returned by a specific REST call ('ping').

Pega Ping service FAQs | Support Center


Expert solution:

  • Download the Splunkbase app to get data in from a SQL database: Splunk DB Connect
    • Data can be imported or exported
    • Database lookups: match a key against rows in the database to return extra fields
  • Follow these Docs: GDI from a REST API: Modular Input
    • Protip: Splunk Add-on Builder
    • If you use the builder, its GUI walks you step by step through connecting to an API and pulling values back, generating some of that backend code for you.
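For the REST side of the Pega question, the polling logic inside a modular input usually boils down to fetch, flatten, print. A minimal Python sketch of that loop body (the ping URL and JSON fields here are hypothetical; check the Pega ping FAQ linked above for the real endpoint and payload):

```python
import json
from urllib.request import urlopen  # stdlib only

# Hypothetical ping endpoint; adjust host/path to your Pega PRPC install.
PING_URL = "http://pega-host:8080/prweb/PRRestService/monitor/pingservice/ping"

def ping_to_event(payload: dict) -> str:
    """Flatten the ping JSON into a key="value" line that Splunk indexes cleanly."""
    return " ".join(f'{k}="{v}"' for k, v in sorted(payload.items()))

def stream_ping_event(url: str = PING_URL) -> None:
    """The body of a modular input's streaming loop: fetch, flatten, print to stdout."""
    with urlopen(url, timeout=10) as resp:
        payload = json.loads(resp.read())
    print(ping_to_event(payload))
```

The Add-on Builder generates scaffolding like this for you; the flattening step is where you decide which ping fields become searchable Splunk fields.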



Jamf Pro logs are in XML and it's difficult to search for something specific. For example, when I search for a specific app such as Chrome, I get every app installed on all hosts instead of just the one I searched for. Also, once I select a specific host, I can no longer see the apps installed on it, and vice versa: once I click on apps, I can no longer see the host.

I'd appreciate any help with parsing this and being able to search for specific things. My goal is to be able to search for specific apps installed per host.

Splunk Employee

Expert Solution: 

Use the search command:  xmlkv

  • This will understand the XML structure and extract it into key-value pairs, which you can then pipe into a stats command, a timechart, or whichever reporting command you want after that.

Roughly what your search will look like: 

index=webapp sourcetype=jamfpro | xmlkv | search app="<app name>" | stats count by host, app
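To see why xmlkv solves the app-per-host problem, here is roughly what it does under the hood, sketched in Python (the sample XML is a simplified, hypothetical stand-in; real Jamf Pro records are more deeply nested):

```python
import xml.etree.ElementTree as ET

# Simplified, hypothetical Jamf-style event; real Jamf Pro XML differs.
SAMPLE = """
<computer>
  <host>laptop-01</host>
  <app>Chrome</app>
  <version>111.0</version>
</computer>
"""

def xml_to_kv(xml_text: str) -> dict:
    """Roughly what xmlkv does: turn each leaf element into a key-value pair."""
    kv = {}
    for elem in ET.fromstring(xml_text).iter():
        # A leaf has no child elements; its tag becomes the field name.
        if len(elem) == 0 and elem.text and elem.text.strip():
            kv[elem.tag] = elem.text.strip()
    return kv
```

Once the pairs exist as fields, both the `search app=...` filter and `stats count by host, app` work, which is exactly the per-host filtering the raw XML wouldn't allow.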


Another option:

| xpath "//xx/xxxx()"

| rex  ← Just remember to anchor extractions and minimize wildcards for performance reasons
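The anchoring advice matters because an unanchored wildcard forces the regex engine to scan and backtrack across the whole event. A small Python illustration of the difference (the field names are purely illustrative):

```python
import re

line = 'host="laptop-01" app="Chrome" version="111.0"'

# Anchored on the literal field name with a negated character class:
# the engine finds 'app="' once and never backtracks.
fast = re.compile(r'app="(?P<app>[^"]+)"')

# Leading greedy .*: the engine first scans to the end of the line,
# then backtracks until 'app="' matches. Same result, more work.
slow = re.compile(r'.*app="(?P<app>[^"]+)"')
```

The same principle applies to rex in SPL: prefer literal anchors and `[^"]+`-style classes over `.*` wherever the event format allows.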


Another tip: check out this great related article by Splunker Brent Davis. It focuses on Ingest Actions, but it's a good regex performance article in general.


There are a few more tips that the experts covered in the session (I'll send that slide deck out shortly)!