Splunk Tech Talks
Deep-dives for technical practitioners.

Splunk Cloud Platform Migration Lessons Learned

WhitneySink
Splunk Employee

For a smooth migration to Splunk Cloud, there are many technical questions you need to be able to answer. For example, what are your data sources and data flows to Splunk? How do you map these data sources to the Technical Add-ons (TAs)? What is your data quality and index capacity? Which architecture/ingestion tier is best for you? What role-based access do you want post-migration? How do you utilize the Splunk Cloud Migration Application? Our webinar will discuss these questions and provide some lessons learned from our Splunk Cloud Platform migration.

Tune in to learn about:

  • Data sources, workflows, and mapping to TAs
  • Determining the best architecture/ingestion tier for you
  • BlueVoyant Splunk Cloud migration lessons learned
WhitneySink
Splunk Employee

Tune in here to watch this On Demand Tech Talk: 

https://events.splunk.com/Splunk-Cloud-Platform-Migration-Lessons-Learned

We look forward to seeing you!

WhitneySink
Splunk Employee

Below is the Q&A from our Tech Talk. Please keep the conversation going here!

Q: What if the roles are international, that is, users coming in from around the globe?

A: Native Splunk does not allow role-based provisioning based on geolocation. You can do it with an IdP such as Okta or Azure AD. Just remember, Splunk Cloud only allows one IdP, so make sure you can leverage federation if you plan a more robust environment.
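To illustrate the division of labor: the geolocation logic lives in the IdP (for example, Okta group rules that assign users to groups by region), while Splunk Cloud simply maps those IdP groups to Splunk roles in the `[roleMap_SAML]` stanza of authentication.conf. A minimal sketch, with hypothetical group names:

```ini
# authentication.conf (sketch; group names are hypothetical examples)
[authentication]
authType = SAML
authSettings = saml_okta

[roleMap_SAML]
# Splunk role = semicolon-separated list of IdP group names
admin = Splunk-Admins
user = Splunk-Users-EMEA;Splunk-Users-APAC
```

The IdP decides which groups a user lands in; Splunk only consumes the group attribute from the SAML assertion.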

 

Q: What things did you learn as part of your cloud migration?

A: I found lots of cruft/unused data and knowledge objects. Don't replicate them in Splunk Cloud if at all possible.

 

Q: It would be helpful if you could point out which aspects of this are relevant if the plan is to move to Hybrid/Federated Search vs. using the Cloud search head, and perhaps the pros and cons of each.

A: One thing to note is which Splunk Cloud Experience your stack is on. Classic uses Hybrid Search, which is going away, and Victoria uses Federated Search. Where Hybrid offered a more seamless SPL experience, Federated requires federated index configurations and some slight changes to your SPL. Most environments we work in leverage Federated Search when multiple stacks are in place or an application development environment is involved. Federated Search gives you the flexibility to build and develop your application locally and then deploy it to the cloud via ACS. We will be doing additional talks in the future around CI/CD pipelines and Splunk Cloud, which will hit on this topic. All of this is relevant when moving to Splunk Cloud and using the Federated Search component: your data still needs to be cleaned and managed accordingly, and making sure the right applications are migrated to the correct location helps minimize your risk of outages and breaks.
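As a concrete example of the "slight changes to your SPL" mentioned above: once a federated provider and a federated index have been defined, the remote data is addressed with an `index=federated:<name>` prefix rather than the plain index name. A sketch, with a hypothetical federated index name:

```spl
search index=federated:onprem_web sourcetype=access_combined status>=500
| stats count BY host
```

The same search against a local index would just use `index=onprem_web`; the `federated:` prefix is the main syntactic difference users see.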

 

Q: What was the biggest hangup during your Cloud migration?

A: Getting the data in, honestly. Make sure to work closely with your sales reps to establish access. Once you have access, install all the TAs in Splunk Cloud and then start moving the feeds over. Splunk Cloud makes TLS configuration easy, but depending on the selected architecture, it can be cumbersome. Also, be prepared on the change-request front: a lot of changes are going to have to happen, and you want that team ready to move quickly.
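"Moving the feeds over" ultimately comes down to pointing your forwarders' outputs.conf at the Splunk Cloud ingestion endpoints. In practice you would deploy the forwarder credentials package downloaded from your own stack; the sketch below uses a hypothetical stack name just to show the shape of the config:

```ini
# outputs.conf (sketch; "mystack" is a hypothetical Splunk Cloud stack name)
[tcpout]
defaultGroup = splunkcloud

[tcpout:splunkcloud]
# Splunk Cloud receiving endpoints, TLS on port 9997
server = inputs1.mystack.splunkcloud.com:9997, inputs2.mystack.splunkcloud.com:9997
sslVerifyServerCert = true
```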

 

Q: Does the migration tool estimate the cost of running in Splunk Cloud?

A: It provides some aspects of the cost. You'll need to work with your sales team to get accurate quoting; the tool cannot provide an accurate license quote.

 

Q: What were the lessons learned?

A: The biggest lessons learned are:

  • Document everything you currently have.
  • Ask whether it really needs to migrate.
  • If you're doing data migration, make sure to balance the bucket counts before and after.
  • Document everything.
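One way to take the bucket inventory mentioned above is with the `dbinspect` SPL command, run in both the source and destination environments so the counts can be compared. A minimal sketch:

```spl
| dbinspect index=*
| stats count AS buckets, sum(eventCount) AS events BY index
```

Running this before the migration on-prem and again after rehydration in Splunk Cloud gives a per-index bucket and event count to reconcile.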

 

Q: Which version of the cloud backend are you on? Do you have experience with AWS or GCP or something else? We have found that Victoria is not available on GCP.

A: We see a lot of AWS environments. We have worked with some GCP, but very few these days. Victoria is not on GCP, and I believe ACS is limited there as well. I'd recommend checking the service limits before selecting an environment. Also, ingestion charges are something to consider when selecting that environment. Good luck.

 

Q: Is there a limit to how much indexed data can be exported to Splunk Cloud?

A: In theory, no. You can send as much to Splunk Cloud as your license allows. Splunk Cloud will not meter the ingestion; however, expect overage charges when it comes to your contract.
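To keep an eye on how close you are to that license before overages kick in, daily ingest can be charted from the internal license usage logs. A sketch (run where the license usage logs are available, typically the license manager on-prem or the cloud search head):

```spl
index=_internal source=*license_usage.log* type=Usage
| eval GB = b/1024/1024/1024
| timechart span=1d sum(GB) AS daily_ingest_GB
```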

 

Q: Are Parallel Migrations Supported (in order to avoid system downtime and to ensure that the new environment meets all existing requirements prior to cutover from the legacy on-premises environment) ?

A: When it comes to on-prem Splunk to Splunk Cloud migrations - yes! And it's recommended. We call this our rehydration approach, as we do not like to migrate on-prem data to Splunk Cloud directly. Splunk Cloud leverages SmartStore, and if you're not on SmartStore on-prem, it's a very painful process. I will say, work with the sales team to ensure you have your licensing figured out. We typically run parallel migrations for about three months to ensure we hydrate the cloud environment with enough data to be useful. This is made very easy with outputs.conf and the deployment server.
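The outputs.conf trick behind the parallel run is simply listing two target groups in `defaultGroup`, which clones every event to both destinations. A sketch with hypothetical hostnames, typically pushed to forwarders via the deployment server:

```ini
# outputs.conf (sketch; hostnames are hypothetical examples)
[tcpout]
# Two groups = data is cloned to both the legacy and cloud environments
defaultGroup = onprem_indexers, splunkcloud

[tcpout:onprem_indexers]
server = idx1.example.com:9997, idx2.example.com:9997

[tcpout:splunkcloud]
server = inputs1.mystack.splunkcloud.com:9997
sslVerifyServerCert = true
```

At cutover, dropping `onprem_indexers` from `defaultGroup` ends the parallel run without touching the inputs.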

 

Q: Do you have advice on what to do when you have a TA installed on-prem that is not supported in Splunk Cloud?

A: Is it for inputs or parsing?

 

Q: What did you decide to do with data that existed prior to the migration?

A: 

 

Q: Lessons learned?

A: The biggest lessons I have learned are:

  • There is often lots of cruft/unused data and knowledge objects on-prem. Do they need to be migrated?
  • Make sure to do your bucket inventories before and after.
  • Find the most efficient ways to get your data into Splunk Cloud.

 

Q: I'm concerned about data egress costs for logs. If we use Splunk Cloud, by my estimation 5 TB/day will cost $180k/year. Is this something your customers have experienced?

A: This is a concern. SCMA does have a tab to help calculate it. Can you tell me what you are egressing from? I'm guessing it is something other than AWS. Input TAs can usually go to an on-prem heavy forwarder; parsing ones require reverse engineering. I wish there were a better answer.
