Data Migration in Dynamics 365 — the comprehensive guide to secure, clean and efficient ERP migration projects – Part 1

Data migration is much more than a technical copying process: it determines whether your new ERP system delivers reliable information right from the start or whether you spend the days after go-live on data cleansing, correction postings and support tickets.
What is data migration — and how does it differ from integration or interface work?
Data migration refers to the process of transferring existing data from one or more source systems to a new target system. In the ERP context, this means that master data (customers, suppliers, articles), open documents (orders, incoming invoices), inventories, opening balance positions and often also some of the historical transactions are transferred. Unlike ongoing integrations (ETL/ELT processes or EDI connections), which synchronise or exchange data on a permanent basis, migration is usually a one-off or time-limited project with clear cutover windows, tests and a defined completion date. Migration is therefore a transformation project: data must be extracted, cleaned, transformed, validated and loaded into the target system.
Why is well-planned data migration particularly important for Dynamics 365?
Dynamics 365 Finance & Supply Chain Management is a comprehensive, configurable system that relies on consistent master data and valid references. Incorrect article master data, invalid charts of accounts or incompletely transferred open items lead directly to posting errors, missing orders or incorrect inventory values. In addition, many companies operate Dynamics 365 as a cloud SaaS service: direct SQL access to the production database is usually not available, which means that only supported methods (data entities, the data management framework, APIs, dual-write/Dataverse) may be used. A structured approach is therefore essential, from selecting the data to be migrated to mapping and validation after loading.
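To make the "supported methods only" point concrete, here is a minimal Python sketch that reads customer records through the standard OData data entity endpoint rather than via direct SQL. Tenant, client and environment values are placeholders, and entity and field names should be verified against your own environment:

```python
# Minimal sketch: reading a standard data entity through the supported OData endpoint.
# Tenant, client ID/secret and the environment URL are placeholders for illustration.
import requests

TENANT = "<your-azure-ad-tenant-id>"
CLIENT_ID = "<app-registration-client-id>"
CLIENT_SECRET = "<app-registration-secret>"
ENVIRONMENT = "https://yourorg.operations.dynamics.com"

# 1. Acquire an Azure AD token via the client-credentials flow.
token_response = requests.post(
    f"https://login.microsoftonline.com/{TENANT}/oauth2/v2.0/token",
    data={
        "grant_type": "client_credentials",
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
        "scope": f"{ENVIRONMENT}/.default",
    },
)
token_response.raise_for_status()
access_token = token_response.json()["access_token"]

# 2. Query the CustomersV3 data entity (first 10 records across all legal entities).
customers = requests.get(
    f"{ENVIRONMENT}/data/CustomersV3",
    headers={"Authorization": f"Bearer {access_token}"},
    params={"$top": "10", "cross-company": "true"},
)
customers.raise_for_status()
for record in customers.json()["value"]:
    print(record["CustomerAccount"], record.get("OrganizationName"))
```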
Which data should be migrated — and which is better left archived?
Not all data necessarily needs to be migrated in full detail. A sensible selection is based on usability in day-to-day business and regulatory requirements. Typically, the minimum data sets include:
Master data: customers, suppliers, articles, accounts, price matrices, storage locations.
Active open transactions: open orders, open purchase orders, open supplier invoices, payment allocations.
Opening balances and the balance sheet/invoicing positions as of the migration date.
Historical transaction data (e.g. all posting lines from previous years or completed sales years) is often not migrated in full into the operational tables, but moved to a reporting archive or data warehouse instead. Many teams opt for a hybrid solution: only the last 2–3 years of operational transactions are migrated, while older data remains in the archive and is available via reporting solutions or a read-only system. It is crucial that the chosen strategy takes audit compliance, reporting requirements and operational necessities into account.
Preparation: steps to take before the first data transfer
Successful migrations do not start with the first export, but with a systematic analysis:
First, a data inventory is necessary: identify all relevant data sources, document the existing data schemas and assign responsibilities (data owners). Data profiling and quality assessments are carried out in parallel: How many duplicates are there? How many mandatory fields are missing? Which fields deviate structurally from the target?
Based on this analysis, a migration concept is developed that describes the scope, migration rules, transformations, validation rules and acceptance criteria. Governance issues are clarified: Who decides in the event of conflicting master data? Which data is allowed in the target system? Which test data and masking are necessary for sensitive information?
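As a starting point for such profiling, a small script is often enough. The following sketch (Python/pandas, with illustrative file and column names) counts duplicate candidates, empty mandatory fields and the distribution of a reference field:

```python
# Minimal profiling sketch with pandas: duplicates, missing mandatory fields,
# value distribution of a reference field. File name and column names are
# illustrative and depend on your source system export.
import pandas as pd

MANDATORY_FIELDS = ["CustomerAccount", "Name", "VATNumber", "CustomerGroup"]

df = pd.read_csv("legacy_customers.csv", dtype=str)

# Duplicates on the business key (here: VAT number plus normalised name).
df["_name_norm"] = df["Name"].str.strip().str.upper()
duplicates = df[df.duplicated(subset=["VATNumber", "_name_norm"], keep=False)]
print(f"Potential duplicates: {len(duplicates)} rows")

# Missing mandatory fields per column.
for field in MANDATORY_FIELDS:
    missing = df[field].isna().sum() + (df[field].str.strip() == "").sum()
    print(f"{field}: {missing} empty values")

# Value distribution of a reference field (reveals invalid customer groups early).
print(df["CustomerGroup"].value_counts(dropna=False).head(20))
```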
Strategic decisions: Big Bang vs. phased migration
There are usually two models to choose from when it comes to migration strategy. With Big Bang, all relevant data is loaded into the new system within a narrowly defined cutover window and the old systems are shut down. Advantage: little to no parallel operation and a clear cut. Disadvantage: very high pressure on testing, a large coordination effort and higher risk at go-live.
Phased or trickle migration transfers data step by step, for example first master data and open items, then historical data or specific modules. Advantage: lower risk per release and the opportunity to stabilise processes and train the team. Disadvantage: a longer period of parallel operation and more complex synchronisation requirements. Which strategy makes sense depends on business processes, regulatory requirements, data volume and risk tolerance.
Technical tools and concepts for Dynamics migrations
Microsoft provides several native options for Dynamics 365, which are often combined with external ETL tools in practice:
The Data Management Framework (DMF) is the central, integrated tool in Dynamics 365 for importing and exporting large amounts of data. It uses so-called data entities (predefined or project-specific entities) with which data can be packaged and loaded as data packages (ZIP files). For complex transformations, or when data comes from many heterogeneous sources, supplementary ETL tools are often used.
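For package-based loads, the flow typically looks like this: request a writable blob URL, upload the ZIP package, then start the import into an existing data project. The sketch below follows the documented Data management package REST API; the bearer token, project name and legal entity are placeholders, and the exact endpoint names and payloads should be verified against the current Microsoft documentation:

```python
# Sketch: importing a prepared data package (ZIP) via the Data management
# package REST API. ACCESS_TOKEN stands in for the OAuth token from the
# previous example; all IDs and names are placeholders.
import json
import requests

ENVIRONMENT = "https://yourorg.operations.dynamics.com"
BASE = f"{ENVIRONMENT}/data/DataManagementDefinitionGroups"
ACCESS_TOKEN = "<bearer-token-from-the-oauth-flow-above>"
HEADERS = {"Authorization": f"Bearer {ACCESS_TOKEN}", "Content-Type": "application/json"}

# 1. Ask the environment for a writable blob URL for the package file.
write_url = requests.post(
    f"{BASE}/Microsoft.Dynamics.DataEntities.GetAzureWriteUrl",
    headers=HEADERS,
    json={"uniqueFileName": "CustomersPackage"},
)
write_url.raise_for_status()
# Per the documented API, "value" is a JSON string containing the BlobUrl.
blob_url = json.loads(write_url.json()["value"])["BlobUrl"]

# 2. Upload the data package (Manifest.xml, PackageHeader.xml, entity files).
with open("CustomersPackage.zip", "rb") as package:
    upload = requests.put(
        blob_url, data=package, headers={"x-ms-blob-type": "BlockBlob"}
    )
    upload.raise_for_status()

# 3. Start the import into an existing data project (definition group).
import_response = requests.post(
    f"{BASE}/Microsoft.Dynamics.DataEntities.ImportFromPackage",
    headers=HEADERS,
    json={
        "packageUrl": blob_url,
        "definitionGroupId": "CustomerMigration",  # existing data project
        "executionId": "",                         # let the system generate one
        "execute": True,
        "overwrite": True,
        "legalEntityId": "USMF",
    },
)
import_response.raise_for_status()
print("Execution id:", import_response.json()["value"])
```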
Mapping & transformation — the real work
Mapping is at the heart of every migration: fields from the legacy system must be assigned to an entity and a field in Dynamics 365. Common challenges include different field formats (e.g. date formats, units of measure), different key formats (e.g. item numbers with leading zeros), missing references and deviating business logic (e.g. different posting rules). Good practice is to define mapping rules in a traceable document, maintain lookup tables for reference data, and build transformations in a modular way so that adjustments can be implemented quickly in test runs.
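A modular transformation for one entity can be as simple as a small, versioned function plus lookup tables. The following sketch uses illustrative legacy field names and mapping rules:

```python
# customer_mapping.py -- illustrative, reusable mapping artefact for one entity.
# Field names and lookup values are examples; the real rules come from the
# documented mapping definition.
from datetime import datetime

# Lookup table for reference data that differs between source and target.
CUSTOMER_GROUP_MAP = {
    "A": "DOM",  # legacy group A -> domestic customers
    "B": "EXP",  # legacy group B -> export customers
}

def map_customer(source: dict) -> dict:
    """Transform one legacy customer record into the target entity format."""
    return {
        # Keep leading zeros by treating the account number as text, padded to 10.
        "CustomerAccount": source["cust_no"].zfill(10),
        "OrganizationName": source["name"].strip(),
        # Legacy date format DD.MM.YYYY -> ISO 8601 expected by the target.
        "CustomerSince": datetime.strptime(source["created"], "%d.%m.%Y").strftime("%Y-%m-%d"),
        # Reference data via lookup table; fail loudly on unmapped values.
        "CustomerGroupId": CUSTOMER_GROUP_MAP[source["group"]],
    }
```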
A typical mistake is to code mapping and transformations ‘ad hoc’ in the load job. It is better to create a reusable mapping artefact that is tested, versioned and transparently documented. Automated unit tests for transformations (e.g. generating and comparing expected target data from test data sets) significantly reduce risks.
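Such a mapping artefact lends itself to automated tests. A minimal pytest sketch against the (hypothetical) customer_mapping module from the previous example:

```python
# test_customer_mapping.py -- minimal pytest sketch for the mapping above.
import pytest
from customer_mapping import map_customer  # hypothetical module from the sketch above

def test_leading_zeros_and_date_format():
    source = {"cust_no": "4711", "name": " Contoso GmbH ", "created": "05.01.2019", "group": "A"}
    target = map_customer(source)
    assert target["CustomerAccount"] == "0000004711"
    assert target["OrganizationName"] == "Contoso GmbH"
    assert target["CustomerSince"] == "2019-01-05"
    assert target["CustomerGroupId"] == "DOM"

def test_unmapped_customer_group_fails_loudly():
    source = {"cust_no": "1", "name": "X", "created": "01.01.2020", "group": "Z"}
    with pytest.raises(KeyError):
        map_customer(source)
```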
Data cleansing: Clean up BEFORE loading
Most migration projects fail not because of the loading itself, but because of poor data quality. Data cleansing is a key prerequisite: removing duplicates, filling in mandatory fields, archiving outdated or unused master data, standardising formats (addresses, telephone numbers) and comparing against reference lists (e.g. valid product groups, tax codes).
Data profiling tools provide insight into the distribution of values and frequent anomalies. The cleansing must be coordinated with the relevant departments: Which duplicates can be deleted, which can be merged, and how is the history retained? A frequently chosen approach is to form a dedicated cleansing team of subject-matter experts and data engineers, define clear rules, and apply these rules as early as the test migrations.
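In practice, many of these rules can be automated and re-run before every test migration. A small cleansing sketch (pandas, with illustrative column names and reference values):

```python
# Cleansing sketch with pandas: deduplication, format standardisation and a
# check against a reference list. Column names and rules are illustrative.
import pandas as pd

VALID_TAX_CODES = {"VAT19", "VAT7", "EXEMPT"}  # reference list from the target system

df = pd.read_csv("legacy_customers.csv", dtype=str)

# 1. Standardise formats before comparing (names, phone numbers, ...).
df["Name"] = df["Name"].str.strip()
df["Phone"] = df["Phone"].str.replace(r"[^\d+]", "", regex=True)

# 2. Remove duplicates on the business key, keeping the most recent record.
df = df.sort_values("LastModified").drop_duplicates(
    subset=["VATNumber", "Name"], keep="last"
)

# 3. Records whose tax code is not in the target's reference list go back to
#    the business department instead of into the load file.
invalid = df[~df["TaxCode"].isin(VALID_TAX_CODES)]
invalid.to_csv("cleansing_backlog_taxcodes.csv", index=False)
df = df[df["TaxCode"].isin(VALID_TAX_CODES)]

df.to_csv("customers_cleaned.csv", index=False)
print(f"{len(df)} records ready for migration, {len(invalid)} sent back for clarification")
```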
Part 2 of our blog explains how the go-live should proceed and what you should pay particular attention to during and after it.