Planning a successful data migration – part two

If you’re reading this blog with a view to carrying out the ‘test’ and ‘deliver’ stages of your data migration, you should have already completed the ‘discover’ and ‘design’ stages detailed in part one of this blog. Part two will focus on the thorough testing procedures that need to be carried out right up to your ‘go live’ date. You’re nearly there!

TEST

The testing part of the process is absolutely crucial, so make sure you’ve devoted sufficient time to it in your plans.

The first step is to build a thorough test plan, detailing what you want to test, why it needs to be tested, how you’ll perform it and what resources you’ll use to do it.

Be sure to consider: 

  • Code testing – this will ensure that specific elements of the code are performing in the way they’re designed to. 
  • Process testing – to test the performance and accuracy of your migration controller and migration processes. This is a test of the full end-to-end process and how well your code knits together as a full application.
  • Coverage loads – Use a subset of data – definitely no more than 10,000 rows. This doesn’t need to be accurate; it’s used simply to test that you can physically load data into your target application using the application load utility or a third-party tool (see the sketch after this list). 
  • Volume loads – Once you’ve completed a coverage load and the majority of the target application is developed, you should run a volume load, where you attempt to load as much of the final file as possible. Again, at this stage the data doesn’t need to be accurate. You’re looking to estimate load times and test assumptions around orchestration. It’s normal to see rejected rows, failures and exceptions when you run this for the first time but it will give you a good idea of what improvements need to be made.
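
To make the coverage load idea concrete, here’s a minimal sketch in Python. It pulls a capped sample of rows from a stand-in legacy database and loads it into a stand-in target – the in-memory databases, table and column names are illustrative only, and in practice you’d use your application’s load utility or third-party tool.

```python
import sqlite3

# Hypothetical stand-ins: in a real migration these would be your legacy
# database and the target application's load utility or staging schema.
legacy = sqlite3.connect(":memory:")
target = sqlite3.connect(":memory:")

legacy.execute("CREATE TABLE customers (id INTEGER, name TEXT, email TEXT)")
legacy.executemany(
    "INSERT INTO customers VALUES (?, ?, ?)",
    [(i, f"Customer {i}", f"user{i}@example.com") for i in range(50_000)],
)

target.execute("CREATE TABLE customers (id INTEGER, name TEXT, email TEXT)")

# Coverage load: cap the sample well below full volume (here 10,000 rows).
# Accuracy doesn't matter yet - the goal is proving the load path works.
SAMPLE_SIZE = 10_000
rows = legacy.execute(
    "SELECT id, name, email FROM customers LIMIT ?", (SAMPLE_SIZE,)
).fetchall()

target.executemany("INSERT INTO customers VALUES (?, ?, ?)", rows)
target.commit()

loaded = target.execute("SELECT COUNT(*) FROM customers").fetchone()[0]
print(f"Coverage load complete: {loaded:,} rows loaded into the target")
```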

Reconciliation testing

This will have been performed throughout the code testing phase, but it’s also useful to take a macro view of reconciliation. Here you should be looking at the data that didn’t make it through your Data Quality Records (DQRs) or couldn’t be loaded into the target application.
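
As a rough sketch of that macro view, the Python below simply accounts for every legacy row across the loaded, DQR-rejected and load-failure buckets. The counts are hard-coded placeholders – in a real run they’d come from your legacy extract, your DQR reject files and the target load logs.

```python
# Macro reconciliation: account for every legacy row at each stage.
# The counts below are placeholders - in practice they would come from
# your legacy extract, DQR reject files and target load logs.
legacy_rows = 1_000_000
dqr_rejects = 12_400       # rows that failed data quality checks
load_failures = 850        # rows the target application refused
loaded_rows = 986_750      # rows confirmed in the target

accounted_for = loaded_rows + dqr_rejects + load_failures
unaccounted = legacy_rows - accounted_for

print(f"Loaded:        {loaded_rows:>10,}")
print(f"DQR rejects:   {dqr_rejects:>10,}")
print(f"Load failures: {load_failures:>10,}")
print(f"Unaccounted:   {unaccounted:>10,}")

# Every row should be accounted for; anything unexplained needs investigating.
assert unaccounted == 0, "Reconciliation gap - investigate before go-live"
```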

Performance tuning

As part of the project planning you should have identified a ‘migration window’: a period in which you can execute the migration without bumping into critical business processes. Performance tuning gives you the ability to optimise your code and processes so that you can comfortably run your migration within that window.
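
One simple way to keep tuning honest is to time each step of a rehearsal run and compare the total against the agreed window. The sketch below assumes a 12-hour window and uses placeholder steps – substitute your real extract, transform and load calls.

```python
import time
from datetime import timedelta

# Agreed migration window, e.g. a 12-hour weekend outage (illustrative).
MIGRATION_WINDOW = timedelta(hours=12)

def timed_step(name, func):
    """Run one migration step and report how long it took."""
    start = time.perf_counter()
    func()
    elapsed = timedelta(seconds=time.perf_counter() - start)
    print(f"{name:<12} {elapsed}")
    return elapsed

# Stand-in steps - replace with your real extract/transform/load calls.
steps = [
    ("extract", lambda: time.sleep(0.2)),
    ("transform", lambda: time.sleep(0.3)),
    ("load", lambda: time.sleep(0.5)),
]

total = sum((timed_step(name, fn) for name, fn in steps), timedelta())
headroom = MIGRATION_WINDOW - total
print(f"Total runtime: {total}, headroom against the window: {headroom}")
```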

Don’t be afraid to lean on your vendor at this stage for help and advice. If you notify them of your performance tests in advance, they’re sometimes able to commission more representative environments you can use for testing.

End-to-end testing

If you’ve been performing regular volume and reconciliation tests and performance tuning as you go, you shouldn’t have too many surprises during end-to-end testing. However, it’s still extremely sensible to recreate the planned ‘go-live’ scenario.

It’s also important that you include User Acceptance Testing (UAT). It can be beneficial to have users prepare scenarios that recreate what they’re likely to be doing in the new application and try them out.

Then, run end-to-end testing in conjunction with UAT as if you were performing the real migration to discover if things run as smoothly as you’d hoped. Ensure you have a workflow to manage any data bugs so migration developers and project managers have full visibility.
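
That workflow can be as lightweight as a shared log of data bugs with enough detail for developers and project managers alike. The sketch below shows one illustrative shape for it – most teams will use their existing issue tracker, but the fields are the important part.

```python
import csv
from dataclasses import dataclass, asdict
from datetime import date

# Minimal stand-in for a data bug tracker; the fields matter more than the tool.
@dataclass
class DataBug:
    raised: str
    source_table: str
    description: str
    severity: str        # e.g. "blocker", "major", "minor"
    owner: str
    status: str = "open"

bugs = [
    DataBug(str(date.today()), "customers", "Email null for 312 rows", "major", "dev-team"),
    DataBug(str(date.today()), "orders", "Currency code missing", "blocker", "dev-team"),
]

# Write to a shared CSV so project managers have the same visibility as developers.
with open("data_bugs.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=list(asdict(bugs[0]).keys()))
    writer.writeheader()
    writer.writerows(asdict(b) for b in bugs)

print(f"{len(bugs)} data bugs logged to data_bugs.csv")
```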

DELIVER

Go-live 

Despite all your careful planning and preparation, this part can still be nerve-wracking. Monitoring tools and dashboards can be useful for keeping abreast of the process, with milestones that turn green as they’re achieved and notifications configured in advance.
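
At its simplest, such a dashboard is just a list of milestones that flip to done as each stage completes, with a notification hook wired in. The milestone names and the notify function below are assumptions for illustration.

```python
# Illustrative go-live monitor: track milestones and raise notifications.
milestones = {
    "legacy extract complete": False,
    "transformations complete": False,
    "target load complete": False,
    "reconciliation signed off": False,
}

def notify(message):
    """Stand-in for the email/Slack/pager alerts you'd configure for the real run."""
    print(f"[NOTIFY] {message}")

def complete(name):
    """Mark a milestone as achieved and send a notification."""
    milestones[name] = True
    notify(f"Milestone reached: {name}")

def status():
    """Print a simple dashboard of milestone progress."""
    for name, done in milestones.items():
        print(f"{'DONE   ' if done else 'PENDING'}  {name}")

complete("legacy extract complete")
complete("transformations complete")
status()
```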

Finally, don’t go live without a contingency plan, and ensure you’re able to fall back onto your old infrastructure should anything go wrong.

Post go-live support

Make sure that you have post go-live support in place for the data migration as well as the new application. Ideally, you need someone who thoroughly understands the DQRs and migration controller and has been exposed to the totality of your legacy data – one of your migration developers would be perfect.

Technology integration

It’s probable that your application is a significant focal point for wider business operations and processes, and with that comes a need to integrate your application with other technologies. This is a distinct workstream in its own right, and should be taken into account during your landscape analysis. When you’re considering integration, there are often plenty of off-the-shelf integrations you can use. Alternatively, you may be able to reuse the ETL (extract, transform, load) tool you built your migration controller in, along with much of the transformation and load code.
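
To illustrate the reuse point, a transformation written for the migration can often be packaged as a plain function and called again by the ongoing integration feed. Everything in the sketch below – the field names and the transform itself – is assumed for illustration.

```python
# A transformation built for the migration...
def transform_customer(record: dict) -> dict:
    """Normalise a legacy customer record into the target schema (illustrative)."""
    return {
        "customer_id": int(record["id"]),
        "full_name": record["name"].strip().title(),
        "email": record["email"].strip().lower(),
    }

# ...can be reused by the one-off migration load and by an ongoing
# integration feed, keeping the mapping logic in a single place.
legacy_batch = [{"id": "42", "name": "  jane DOE ", "email": " Jane@Example.com "}]

migrated = [transform_customer(r) for r in legacy_batch]   # one-off migration run
nightly = transform_customer(legacy_batch[0])              # ongoing integration feed

print(migrated)
print(nightly)
```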

For more depth and insight on the data migration process, simply download a copy of our free whitepaper. Or, if you’ve hit a roadblock (or two!) with your project, don’t hesitate to get in touch.

A Guide for Data Migration Success

Did you know that according to Experian, due to issues with data migration, only 46% of new database implementations are delivered on time? Or that an incredible 74% of projects go over budget?