Getting Data In

New Splunk Cloud environment - Initial setup

Path Finder

Hi All,

We recently purchased Splunk Cloud, but we couldn't send any logs to it because the required ports are blocked. Could you please answer the questions below:

  1. How do we ingest logs from Universal Forwarders (500 Windows and Linux servers)? Which ports should be allowed from the UFs, and to where (individual indexer IPs in Splunk Cloud, and are they static)? Can we send logs directly from the UFs, or should we use a gateway server (HF or UF)?
  2. We want to build a Deployment Server to manage clients. Which ports should be allowed from the DS, and to where (only to the UFs, or also to the CM and HF)?
  3. We want to ingest syslog from devices. We keep syslog data on our syslog (Unix) server for a few weeks. Can we install a UF on that server and send directly to Splunk Cloud, or should we route it via an HF?
  4. After installing the Universal Forwarder and the credentials package, inputs.conf is present in the /default folder. Should we copy the complete inputs.conf into the local folder, or can we create a new app containing the inputs.conf and deploy it to all clients?
  5. Will data sent from the UFs to the cloud be encrypted, or will it go as clear text?

Splunk Employee
  1. UFs should be able to send to the Splunk Cloud indexers on port TCP 9997. There are pros and cons to using a gateway server, but weighing them requires a better understanding of your environment.
  2. The Splunk deployment clients communicate with the Deployment Server on port TCP 8089. You shouldn't need a Cluster Master for your Cloud configuration.
  3. You can install an HF or a UF on the syslog server. An HF would allow you to filter or discard event messages that you don't want to send to the cloud, whereas the UF lacks that ability.
  4. Best practice would be to create a new app that contains the appropriate inputs.conf. This can be pushed to the UFs via the Deployment Server.
  5. It will be encrypted.
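For point 4, the deployment app is just a folder containing the inputs.conf, placed under deployment-apps on the Deployment Server. A minimal sketch (the app name and monitor path here are hypothetical, pick names that fit your own environment):

```ini
# On the Deployment Server (names are illustrative):
#   $SPLUNK_HOME/etc/deployment-apps/org_all_app_inputs/local/inputs.conf

# Monitor a hypothetical application log directory
[monitor:///var/log/myapp]
index = main
sourcetype = myapp:log
disabled = false
```

The Deployment Server then pushes this app to the UFs, and each forwarder merges it into its own configuration, so you never edit the /default files directly.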

Path Finder

Thanks for the details.
2. Should the source be the Deployment Server and the destination the UFs and HFs, or is two-way communication required?
3. The syslog server will be managed by Server Ops, and we want any filtering to happen on a Splunk HF that the Splunk team manages. So I think I can install a UF on the syslog server and forward to the HF for filtering (props)?
4. Sounds good.
5. Is there any official document about the encryption? It would be helpful to send to Security Operations. Since we are sending data over the internet, do we need to enable anything for encryption?
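On the UF-to-HF filtering idea in point 3: the standard approach on a heavy forwarder is to route unwanted events to the nullQueue with props.conf and transforms.conf. A minimal sketch, assuming a hypothetical sourcetype and a pattern you want to drop:

```ini
# props.conf on the HF (sourcetype name is illustrative)
[syslog:network]
TRANSFORMS-drop_noise = drop_debug_events

# transforms.conf on the HF
[drop_debug_events]
# Discard events matching this (hypothetical) pattern
REGEX = \bDEBUG\b
DEST_KEY = queue
FORMAT = nullQueue
```

Events matching the REGEX are discarded before they are forwarded to Splunk Cloud; everything else passes through unchanged.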


Splunk Employee

On #1 - You need to use the Splunk Cloud forwarder credentials app in addition to the standard Splunk UF bits to ensure the data makes it to your Cloud deployment. See this for details - The credentials package ensures data is delivered compressed and encrypted. See this for #5 -
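To give Security Operations something concrete: the credentials app is essentially a pre-built outputs.conf plus client certificates. A rough, illustrative sketch of the kind of settings it carries (the hostnames and paths below are placeholders, not your actual stack):

```ini
# outputs.conf shipped inside the Splunk Cloud credentials app
# (values here are illustrative placeholders)
[tcpout]
defaultGroup = splunkcloud

[tcpout:splunkcloud]
# Your stack's indexer endpoints on TCP 9997
server = inputs1.example.splunkcloud.com:9997
# TLS settings so data crosses the internet encrypted
sslCertPath = $SPLUNK_HOME/etc/apps/100_splunkcloud/default/client.pem
sslVerifyServerCert = true
useClientSSLCompression = true
```

You don't author this file yourself; it is downloaded from your Splunk Cloud instance and installed on each forwarder, but showing its shape usually satisfies a security review.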

On #2 - the Deployment Clients (UFs, IFs, and HFs in your case) phone home to the Deployment Server, so the connection is initiated by the client. The DS setup for Splunk Cloud is the same as on-prem, if you have experience with that. In either case, see this link for more details -
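The phone-home direction means each forwarder only needs a deploymentclient.conf pointing at the DS; no inbound rule to the forwarders is required. A minimal sketch, with a hypothetical DS hostname:

```ini
# deploymentclient.conf on each UF/HF (hostname is illustrative)
[deployment-client]

[target-broker:deploymentServer]
# Management port 8089 on your Deployment Server
targetUri = ds.example.com:8089
```

So the firewall rule is outbound only: forwarders to the DS on TCP 8089, alongside forwarders to the Cloud indexers on TCP 9997.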
