The testing part of the process is absolutely crucial, so make sure that you’ve devoted sufficient time to it in your plans.
The first step is to build a thorough test plan, detailing what you want to test, why it needs to be tested, how you’ll perform each test and what resources you’ll need.
Be sure to consider:
- Code testing – this will ensure that specific elements of the code are performing in the way they’re designed to.
- Process testing – to test the performance and accuracy of your migration controller and migration processes. This is a test of the full end-to-end process and how well your code knits together as a full application.
- Coverage loads – use a small subset of data, certainly no more than 10,000 rows, to test whether you can physically load data into your target application using the application’s load utility or a third-party tool. The data doesn’t need to be accurate at this stage.
- Volume loads – once you’ve completed a coverage load and developed the majority of the target application, run a volume load, where you attempt to load as much of the final file as possible. Again, accuracy is not critical at this stage; you’re looking to estimate load times and test assumptions around orchestration. It’s normal to see rejected rows, failures and exceptions on the first runs, but the results will give you a good idea of what improvements need to be made.
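A coverage load can be as simple as pushing the first few thousand rows of an extract through your load path and recording what rejects. A minimal sketch, assuming a CSV extract and a hypothetical `load_row` function wrapping your target application’s load utility:

```python
import csv

MAX_ROWS = 10_000  # coverage loads use a small subset, not the full file


def coverage_load(extract_path, load_row):
    """Attempt to load up to MAX_ROWS rows; collect rejections rather than stopping."""
    loaded, rejected = 0, []
    with open(extract_path, newline="") as f:
        for i, row in enumerate(csv.DictReader(f)):
            if i >= MAX_ROWS:
                break
            try:
                load_row(row)  # hypothetical wrapper around the target's load utility
                loaded += 1
            except Exception as exc:  # rejections are expected at this stage
                rejected.append((i, str(exc)))
    return loaded, rejected
```

The same harness, pointed at the full file with `MAX_ROWS` lifted, doubles as a crude volume load and gives you timing and rejection data to work from.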
Reconciliation will have been performed throughout the code testing phase, but it is useful to run a macro view of it too. Here you should be looking at the data that didn’t make it through your Data Quality Records (DQRs) or couldn’t be loaded into the target application.
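At the macro level, reconciliation reduces to a count check: every source row should be accounted for as loaded, rejected by a DQR, or rejected at load. A minimal sketch, with hypothetical per-table counts as inputs:

```python
def reconcile(source_counts, loaded_counts, dqr_rejects, load_rejects):
    """For each table, check source rows = loaded + DQR rejects + load rejects.

    Returns a dict of table -> discrepancy; 0 means the table balances,
    anything else means rows have gone missing (or been duplicated) somewhere.
    """
    report = {}
    for table, source in source_counts.items():
        accounted = (loaded_counts.get(table, 0)
                     + dqr_rejects.get(table, 0)
                     + load_rejects.get(table, 0))
        report[table] = source - accounted
    return report
```

Any non-zero discrepancy is a prompt to drill back down into the row-level reconciliation done during code testing.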
As part of the project planning you should have identified a ‘migration window’: a period in which you can execute the migration without bumping into critical business processes. Performance tuning gives you the ability to optimise your code and processes so that your migration comfortably runs within that window.
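Your volume load timings give you the raw material for this: extrapolate the measured throughput to the full row count and compare against the window. A rough sketch, with an illustrative safety factor rather than a real measurement:

```python
def fits_window(rows_loaded, seconds_taken, total_rows, window_hours,
                safety_factor=1.5):
    """Extrapolate volume-load throughput to the full data set.

    safety_factor pads the estimate for rejections, retries and differences
    between test and production hardware; tune it to your own results.
    Returns (fits, estimated_hours).
    """
    throughput = rows_loaded / seconds_taken  # rows per second
    estimated_hours = (total_rows / throughput) * safety_factor / 3600
    return estimated_hours <= window_hours, estimated_hours
```

If the estimate doesn’t fit, that’s your cue to tune the code or renegotiate the window before go-live, not after.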
Don’t be afraid to lean on your vendor at this stage for help and advice. If you notify them of your performance tests in advance, they’re sometimes able to commission more representative environments you can use for testing.
If you’ve been performing regular volume and reconciliation tests and performance tuning as you go, you shouldn’t have too many surprises during end-to-end testing. However, it’s still extremely sensible to recreate the planned ‘go-live’ scenario.
It’s also important that you include User Acceptance Testing (UAT). It can be beneficial to have users prepare scenarios that recreate what they’re likely to be doing in the new application and try them out.
Then, run end-to-end testing in conjunction with UAT as if you were performing the real migration to discover if things run as smoothly as you’d hoped. Ensure you have a workflow to manage any data bugs so migration developers and project managers have full visibility.
Deliver Successful Data Migration
Despite all your careful planning and preparation, this part can still be nerve-wracking. Monitoring tools and dashboards, complete with milestones that turn green as you achieve them and fully configured notifications, can be useful for keeping abreast of the process.
Finally, don’t go live without a contingency plan, so that you’re able to fall back onto your old infrastructure should anything go wrong.
Post go-live support
Make sure that you have post go-live support in place for the data migration as well as the new application. Ideally, you need someone who intrinsically understands the DQRs and the migration controller, and who knows your legacy data inside out – one of your migration developers would be perfect.
It’s probable that your application is a significant focal point for wider business operations and processes, and with that comes a need to integrate it with other technologies. This is a distinct workstream in its own right, and you should account for it during your landscape analysis. There are likely to be plenty of off-the-shelf integrations you can use; alternatively, you may be able to reuse the ETL (extract, transform, load) tool you built your migration controller in, along with much of the transformation and load code.
For more depth and insight on the data migration process, simply download a copy of our free whitepaper. Or, if you’ve hit a roadblock (or two!) with your project, don’t hesitate to get in touch.