The most recent figures suggest that nearly 40% of data migration projects run over deadline, go over budget or fail entirely. More often than not, this is because there isn’t a solid framework in place that adds structure to the project and protects it against the unexpected. At Qbase, we use Jonny Morris’ PDMv2 as the basis for our framework, and we’ve adapted it over time with best practice in mind. This blog shares six golden rules that we think are vital to a successful data migration framework.
What is a data migration framework?
A data migration framework is simply a structured process used to deliver a data migration. It provides a clear roadmap and timeline for the project, and helps safeguard it against any unexpected roadblocks that may arise.
Here are our golden rules for creating a successful data migration framework:
1. Always start with a landscape analysis
First of all, you have to conduct your landscape analysis. As with any major data project, a landscape analysis ensures you have a handle on your data and on any processes that already exist in your organisation. It allows you to properly specify your new solution to accommodate existing data assets and core processes. And knowing the data you currently use in your business means you can accurately plan how to move it into your new solution.
2. Put business at the centre of your data migration
All too often, data migrations are seen as highly technical and therefore a job solely for tech-centred teams. But a data migration is not simply a technical project; it’s something that impacts the whole business. This means that if it goes wrong, the whole business feels the effects.
You need to put business engagement and support at the centre of your data migration framework. Bring in subject matter experts from your sales, finance and marketing teams from the get-go. The idea is that your whole business needs to take ownership of the project. They all must gain an understanding of historic data to help inform choices related to data selection, data preparation, data quality, and decommissioning.
3. Make sure you have the right skills for the job
A sole focus on tech skills for a data migration will create a communication barrier between the technical team and the wider business. In today’s increasingly digitalised businesses, a data migration will impact people from right across the organisation, so their input is crucial and their buy-in is fundamental to the project’s success.
This is where business analysts come into their own – it’s their job to bridge the gap between IT and business. They should be able to understand the needs of the business at the same time as being able to interface effectively with technical team members when discussing solutions and the use of migration software. Business analysts require a clear understanding of formal processes so as not to operate in a reactive way, as well as strong project leadership skills to deliver the data migration on time and to budget.
4. Always plan to scale
Many businesses that start out on their data migration projects underestimate the enormity of the project and just how time-consuming it can be. This is especially true when it comes to the amount of data preparation activity required.
This is ultimately where experience comes in. Typically, a data migration will happen only once in a business’s lifetime, so internal staff won’t have had exposure to this kind of project. Data services companies like Qbase, however, deal with these projects on a daily basis. It can be easy to think we’re ‘over-egging’ our plans in the early stages, but all too often we see customers grateful for that scope once we get going.
5. Assign clear responsibilities to avoid the responsibility gap
Data projects can surface challenges, known as ‘semantic issues’, that impact the wider business. A semantic issue arises when there is a genuine disagreement about the definition of a business term or the use of fields in corporate systems – for example, when a customer’s data has been logged inaccurately, leaving them paying the incorrect amount. Semantic issues are near impossible for an IT team to rectify on their own. However, bringing them to a customer-facing team to fix can be challenging, as they will likely see them as a data-related problem and therefore not their responsibility. Hence the responsibility gap.
You need to anticipate these problems rather than trying to rectify them as they happen. Bringing in people from across the business in the first instance will help to establish responsibility. Business analysts can help here too, by ensuring teams aren’t working in silos and are looking at the business as a whole. A Data Quality Rules (DQR) process means there is a board in place, with representatives from all departments, to review these issues.
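To make the idea concrete, a data quality rule can be expressed as a named check with a clear business owner, so failures land with the right team rather than falling into the responsibility gap. The sketch below is purely illustrative – the field names, rules and owners are invented examples, not part of any specific DQR process:

```python
# Illustrative sketch only: minimal data quality rules (DQRs) with named
# business owners. Field names, rules and owners are hypothetical examples.

from dataclasses import dataclass
from typing import Callable

@dataclass
class DataQualityRule:
    name: str                       # rule identifier reviewed by the DQR board
    owner: str                      # business team responsible for fixing failures
    check: Callable[[dict], bool]   # returns True if the record passes

RULES = [
    DataQualityRule(
        name="billing_amount_positive",
        owner="Finance",
        check=lambda rec: rec.get("billing_amount", 0) > 0,
    ),
    DataQualityRule(
        name="email_present",
        owner="Sales",
        check=lambda rec: bool(rec.get("email")),
    ),
]

def audit(records):
    """Group failing records by rule, so each owner sees their own backlog."""
    failures = {rule.name: [] for rule in RULES}
    for rec in records:
        for rule in RULES:
            if not rule.check(rec):
                failures[rule.name].append(rec)
    return failures

customers = [
    {"id": 1, "email": "a@example.com", "billing_amount": 50.0},
    {"id": 2, "email": "", "billing_amount": -10.0},
]
report = audit(customers)
```

The point of the structure is the `owner` field: every failed check is routed to a department from the outset, which is exactly the anticipation this rule describes.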
6. Make sure you have the right tool for your data migration
With so many data migration tools available, it can be difficult to know which to choose. Your project will only be as successful as your tool allows.
Talend is a powerful data management tool that enables a smooth extract, transform and load (ETL) process for a data migration. Whether you’re moving to a new CRM or implementing a data warehouse, Talend will help you move your data into your new platform without the risk of data loss, meaning you can migrate with absolute confidence. What’s more, we are a Gold Talend partner, so we can help you get the most out of the software by building custom Talend instances to suit your business needs exactly.
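Whatever tool you choose, the underlying extract, transform and load pattern is the same. The sketch below is plain Python, not Talend, and the source system, field names and target are invented purely to show the three stages a migration tool automates:

```python
# Illustrative ETL sketch in plain Python -- NOT Talend itself.
# All system names and fields here are invented for illustration.

def extract(legacy_rows):
    """Extract: read raw records from the legacy system (here, a list)."""
    return list(legacy_rows)

def transform(rows):
    """Transform: clean and reshape records to fit the target schema."""
    cleaned = []
    for row in rows:
        cleaned.append({
            "customer_id": int(row["id"]),           # enforce the target type
            "email": row["email"].strip().lower(),   # normalise for matching
        })
    return cleaned

def load(rows, target):
    """Load: write transformed records into the new platform (here, a dict)."""
    for row in rows:
        target[row["customer_id"]] = row
    return target

legacy = [{"id": "7", "email": "  Jane@Example.COM "}]
new_crm = load(transform(extract(legacy)), {})
```

A real migration tool adds the parts this sketch omits: connectors to actual source and target systems, error handling, logging and reconciliation checks so no records are lost between stages.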