
4 top tips for data migration projects
17 December 2018

Transferring data between computer systems or storage formats is never a minor task.

The complexity of data migration jobs usually means two things are likely to happen: costs will overrun and “go-lives” will be delayed. To help minimise the risk of both, there are some key questions companies should ask themselves before starting any migration work:
• How well do they know their own data?
• How has their data mapping specification been produced?
• Have they considered all aspects and risks of moving their data to the target location?

Businesses should ensure they apply consistent discipline to how their data is developed, and that the plans, policies and practices which protect, control, deliver and enhance the quality and value of that data are well executed and properly supervised.

Here are some tips which can help the process run as smoothly as possible:

1. Make sure you have data governance policies in place

All it takes is to have the right framework in place! A good data governance framework should be robust enough to cover every part of data management as well as the data quality process. There are three main elements within any data governance policy: people, processes and technologies.

While these three are not mutually exclusive, data governance strategies and policies should focus primarily on deciding who has the authority to create, approve, modify or remove data from systems.

Having the correct strategy and framework makes these goals achievable. We strongly recommend forming a data governance council, as data governance requires ongoing monitoring to ensure continuous improvement in data management and quality.

2. Follow best practices for data mapping

Taking the above into consideration, it’s vital to recognise that one of the key elements of a successful migration is ensuring the accuracy of the data analysis, so that each piece of required information ends up in the most appropriate place and in the required format in the target data repository.

Having a clear understanding of how to define the data mapping specification is vital, as it ensures the source data fits the target accurately. This can be achieved by converting the mapping specification into migration code and verifying that code in testing environments to identify errors early, as the sketch below illustrates.
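To make this concrete, here is a minimal sketch in Python of a mapping specification expressed as data and applied by a small piece of migration code, followed by the kind of simple verification you might run in a test environment. The field names, transforms and sample record are illustrative assumptions, not a real schema.

from datetime import datetime

# Declarative mapping specification: target field -> (source field, transform).
# Field names and transforms are illustrative assumptions only.
MAPPING_SPEC = {
    "customer_id": ("cust_no", str.strip),
    "full_name": ("name", str.title),
    "date_of_birth": ("dob", lambda v: datetime.strptime(v, "%d/%m/%Y").date().isoformat()),
    "balance": ("bal", lambda v: round(float(v), 2)),
}

def migrate_record(source: dict) -> dict:
    """Apply the mapping specification to a single source record."""
    return {
        target: transform(source[source_field])
        for target, (source_field, transform) in MAPPING_SPEC.items()
    }

# Verify the generated record against a known example, as you would in a test environment.
legacy = {"cust_no": " 00042 ", "name": "jane doe", "dob": "01/02/1985", "bal": "199.999"}
expected = {
    "customer_id": "00042",
    "full_name": "Jane Doe",
    "date_of_birth": "1985-02-01",
    "balance": 200.0,
}
assert migrate_record(legacy) == expected, "mapping specification does not produce the expected target record"

Keeping the specification declarative like this makes it easier to review with business users and to regenerate the migration code when the specification changes.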

The next step is to adopt an integrated solution, usually a combination of extract, transform and load (ETL) tools and data quality tools, which helps to restructure data for delivery to the target.

We recommend that companies exercise these data management best practices as they bring tremendous value to the data mapping process.

3. Poor data quality can lead companies to disaster!

It’s vital to understand the key role data quality plays: it is core to any data migration. Consider a very common situation where data quality issues exist within legacy systems. These will cause difficulties when trying to make applications interact or when giving them a common definition or structure.

Because the destination often has a stricter data model, or uses a broader set of data to support its enhanced features, it is very likely that poor quality data which never caused an issue in the legacy system will become a problem in the new one, as the sketch below illustrates.
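As a small illustration, the Python sketch below checks legacy records against a few constraints a stricter target model might enforce. The field names, rules and records are made up for the example.

import re

# Constraints assumed to be enforced by a (hypothetical) stricter target system.
TARGET_RULES = {
    "email": lambda v: v is not None and re.match(r"^[^@\s]+@[^@\s]+\.[^@\s]+$", v) is not None,
    "postcode": lambda v: v is not None and v.strip() != "",
    "account_id": lambda v: isinstance(v, str) and v.isdigit(),
}

def violations(record: dict) -> list:
    """Return the fields that would fail the target system's constraints."""
    return [field for field, rule in TARGET_RULES.items() if not rule(record.get(field))]

# Legacy records that never caused a problem in the old system.
legacy_records = [
    {"account_id": "1001", "email": "a.smith@example.com", "postcode": "AB1 2CD"},
    {"account_id": "1002", "email": None, "postcode": "  "},                 # missing email, blank postcode
    {"account_id": "X-17", "email": "not-an-email", "postcode": "EF3 4GH"},  # free-text id, malformed email
]

for rec in legacy_records:
    problems = violations(rec)
    if problems:
        print(f"record {rec.get('account_id')!r} would be rejected by the target: {problems}")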

There are also other issues which may occur as a result of poor data quality:
• Extending the project timeline, as reconciling migrated data needs additional time and resources
• Loss of confidence in the migrated system among the project team and customers
• Customer dissatisfaction due to inaccurate or duplicate data
• Compliance issues
• System integration issues

These scenarios can be prevented by completing a data quality audit to ensure data within the legacy system is in good condition. As more detailed features are created and understanding of both the source and target systems evolves, further assessment is required to validate data quality throughout the migration effort. Such an audit typically consists of a number of common profiling and validation activities; a minimal sketch of a few of them follows.
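The Python sketch below profiles a legacy extract for missing values and duplicate keys. The field names, records and checks are illustrative assumptions rather than a prescribed audit.

from collections import Counter

def audit(records, key_field, required_fields):
    """Profile a legacy extract: row count, missing values and duplicate keys."""
    missing = Counter()
    for rec in records:
        for field in required_fields:
            value = rec.get(field)
            if value is None or str(value).strip() == "":
                missing[field] += 1

    keys = [rec.get(key_field) for rec in records]
    duplicates = [k for k, n in Counter(keys).items() if n > 1]

    return {
        "rows": len(records),
        "missing_by_field": dict(missing),
        "duplicate_keys": duplicates,
    }

# Example run against a tiny, made-up legacy extract.
extract = [
    {"account_id": "1001", "email": "a@example.com", "postcode": "AB1 2CD"},
    {"account_id": "1001", "email": "", "postcode": "AB1 2CD"},   # duplicate key, blank email
    {"account_id": "1003", "email": None, "postcode": ""},        # missing email and postcode
]

print(audit(extract, key_field="account_id", required_fields=["email", "postcode"]))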

4. They are all heroes, but they may not be the best to do the job!

The fact that users create or consume data doesn’t necessarily make them expert candidates for the migration process. It’s always important to consider that they may not be entirely familiar with best practices for the tools, processes or services required.

It is vital for businesses to identify and assign the most knowledgeable people to avoid any data loss or misrepresentation. This allows them to manage data content and definition across the organisation by simplifying data translation and reducing system complexity.

Arum has years of experience in this area, so should you need any assistance with the migration of your systems, please get in touch!

Newsha Nosrati
Senior Analyst
Arum
