Migrating from an on-premises Azure DevOps Server to Azure DevOps Services in the cloud can seem daunting, but with the right preparation and an understanding of key concepts, especially around Azure AD identities, you can streamline the process. This blog walks through a full end-to-end migration: Validate, Prepare, DACPAC extraction, SAS storage upload, and Import. Along the way, we highlight why Microsoft Entra ID (Azure AD) accounts are essential and how to map on-prem users properly.
Introduction and Prerequisites
Why migrate?
- Leverage cloud scalability, built-in integrations, and reduced on-prem maintenance.
- Access advanced features and global collaboration in Azure DevOps Services.
Prerequisites:
- Azure DevOps Server running a supported version (e.g., 2022.1+). Ensure you’ve applied updates so import support is available.
- SQL Server hosting your collection databases (e.g., AzureDevOps_DefaultCollection).
- Azure subscription with rights to create resources (storage account) and an Azure AD tenant associated with your Azure DevOps Services.
- Microsoft Entra ID (Azure AD) users: You need at least one Azure AD user in the target tenant to run the import (this user becomes the organization owner). For mapping, every on-prem user you want to be active in the cloud must have a member or guest Azure AD account in the tenant.
- Data Migration Tool downloaded and extracted on the Azure DevOps Server application tier machine.
- SqlPackage.exe (from DacFx) available to extract DACPAC from the detached collection DB.
- AzCopy or Storage Explorer to upload files to Azure Blob Storage.
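Before starting, it helps to confirm the client tools are actually reachable from the application tier. A minimal PowerShell sanity check (the tool folder path below is an assumption based on the output paths used later in this walkthrough; adjust to wherever you extracted the tools):
# Confirm SqlPackage and AzCopy resolve from PATH, and that Migrator.exe is where you extracted it
Get-Command sqlpackage.exe, azcopy.exe -ErrorAction SilentlyContinue | Select-Object Name, Source
Test-Path "C:\Users\Administrator\Desktop\DataMigrationTool\Migrator.exe"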
High-level workflow:
- Validate the on-prem collection for import readiness.
- Prepare the collection to generate metadata and mapping files.
- Detach the collection from Azure DevOps Server so the database is quiesced/inactive and contains detach markers.
- Extract DACPAC from the detached collection database.
- Upload DACPAC and mapping files to an Azure Storage container using a SAS token (with Read+List permissions).
- Edit and review import.json, specifying the SAS URL and target organization name.
- Import into Azure DevOps Services under an Azure AD account.
- Post-import validation: verify repos, work items, pipelines, permissions, etc.
Throughout, we emphasize Azure AD identity requirements: the import tool itself authenticates with Azure AD, the owner must be an Azure AD user in the target tenant, and identity mapping must point to real Azure AD UPNs so that history is mapped to active users rather than orphaned historical identities.
Validate the Collection
Why Validate?
- Checks whether the collection is eligible for import (schema, data, feature compatibility).
- Generates guidance on any issues to fix before prepare.
Command used:
.\Migrator.exe validate `
  /collection:"http://ec2amaz-61dqj8s/DefaultCollection" `
  /tenantDomainName:"amannextgensoft.onmicrosoft.com" `
  /region:"CUS" `
  /output:"C:\Users\Administrator\Desktop\DataMigrationTool\ValidateOutput"
- /collection: URL of the on-prem collection.
- /tenantDomainName: your Azure AD tenant domain (e.g., amannextgensoft.onmicrosoft.com), so identity mapping can reference Azure AD.
- /region: target Azure region shorthand (e.g., CUS for Central US).
- /output: local folder for logs and guidance.
Interpretation:
- The tool prompts about preview features (Analytics, etc.) and whether to include them in import.
- Identity validation: if on-prem users aren’t found in Azure AD, it warns that they’ll become historical. You may choose to continue or configure Azure AD Connect if you want active mapping.
- If validation passes all steps, move to Prepare.
Prepare the Collection
Why Prepare?
- Generates import files: identity mapping suggestions, process templates, work item type definitions, etc.
- Runs a full validate first; if validation fails, it aborts.
Command used:
.\Migrator.exe prepare `
  /collection:"http://ec2amaz-61dqj8s/DefaultCollection" `
  /tenantDomainName:"amannextgensoft.onmicrosoft.com" `
  /region:"CUS" `
  /output:"C:\Users\Administrator\Desktop\DataMigrationTool\PrepareOutput"
- After prompting about preview features, it validates and then generates import files in the output folder.
- Key generated file: import.json, containing placeholders for SAS location, organization name, import type, plus ValidationData and Identities sections capturing metadata from prepare.
Review Prepare Output:
- identity-mapping CSV: lists on-prem SIDs and suggests mappings to Azure AD UPNs (if found). Edit this file to map each on-prem user to the correct Azure AD account in your tenant. If a mapping target doesn’t exist yet in Azure AD, create or invite that user in Azure AD first.
- import.json: contains:
- Source: placeholder for SAS location.
- Target: placeholder for organization name.
- Properties.ImportType: placeholder (DryRun or ProductionRun).
- ValidationData: metadata captured during prepare; do not modify unless needed.
- Identities: list of on-prem identity SIDs; used during import to map historical entries.
Detach the Collection
Why Detach?
- Import requires a DACPAC from a detached collection database so that Azure DevOps Server has quiesced activity, preserved metadata, and marked the database for import. If the database remains attached/live, import fails with VS403250.
Steps to Detach:
- In Azure DevOps Server Administration Console, navigate to Team Project Collections, select your collection (e.g., DefaultCollection), and choose Detach Collection. Confirm quiesce and detach.
- Alternatively, use TFSConfig:
TFSConfig Collection /detach /collectionName:DefaultCollection /serverUrl:http://ec2amaz-61dqj8s
- Wait for the process to complete. The DB (AzureDevOps_DefaultCollection) now contains detach markers.
- Ensure no jobs/services are running for that collection.
Note: Keep the detached database in place; do not delete or drop it.
Extract the DACPAC
Why extract?
- The import service requires a DACPAC of the detached collection database that captures both the schema and the table data. This file is uploaded to Azure Storage for import.
Ensure permissions to SQL Server:
- The account running SqlPackage.exe (e.g., the Windows Administrator or a SQL login) must have db_owner rights on the detached database.
- If using Windows Authentication, add EC2AMAZ-61DQJ8S\Administrator as a SQL login mapped to the AzureDevOps_DefaultCollection database with db_owner role.
- Or create a SQL login (Mixed Mode) and grant it db_owner on the DB, then supply /SourceUser and /SourcePassword to SqlPackage.exe.
- Include /SourceTrustServerCertificate:True if the SQL instance uses a self-signed or untrusted cert.
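As a concrete sketch, the Windows login and db_owner membership can be granted with sqlcmd; the server, database, and account names are the ones used in this walkthrough, so adjust them to your environment:
# Create the Windows login at the instance level (run as a sysadmin)
sqlcmd -S "EC2AMAZ-61DQJ8S\SQLEXPRESS" -E -Q "CREATE LOGIN [EC2AMAZ-61DQJ8S\Administrator] FROM WINDOWS;"
# Map the login into the detached collection database and add it to db_owner
sqlcmd -S "EC2AMAZ-61DQJ8S\SQLEXPRESS" -E -d "AzureDevOps_DefaultCollection" -Q "CREATE USER [EC2AMAZ-61DQJ8S\Administrator] FOR LOGIN [EC2AMAZ-61DQJ8S\Administrator]; ALTER ROLE db_owner ADD MEMBER [EC2AMAZ-61DQJ8S\Administrator];"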
SqlPackage Extract Command:
& "C:\Users\Administrator\Downloads\sqlpackage-win-x64-en-170.0.94.3\sqlpackage.exe" /Action:Extract `
  /SourceServerName:"EC2AMAZ-61DQJ8S\SQLEXPRESS" `
  /SourceDatabaseName:"AzureDevOps_DefaultCollection" `
  /TargetFile:"C:\Users\Administrator\Desktop\DataMigrationTool\PrepareOutput\Logs\DefaultCollection\<timestamp>\AzureDevOps_DefaultCollection.dacpac" `
  /SourceTrustServerCertificate:True `
  /p:ExtractAllTableData=true `
  /p:IgnoreUserLoginMappings=true `
  /p:IgnorePermissions=true `
  /p:CommandTimeout=0
- /Action:Extract reads the schema and table data into a DACPAC.
- /p:ExtractAllTableData=true ensures table data is included.
- /p:IgnoreUserLoginMappings=true and /p:IgnorePermissions=true skip on-prem SQL logins and permissions that are irrelevant to the import.
Verify that the resulting .dacpac file exists and is larger than zero bytes.
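A quick PowerShell check for this; the path mirrors the /TargetFile value above, and <timestamp> stays a placeholder:
$dacpac = "C:\Users\Administrator\Desktop\DataMigrationTool\PrepareOutput\Logs\DefaultCollection\<timestamp>\AzureDevOps_DefaultCollection.dacpac"
if ((Test-Path $dacpac) -and ((Get-Item $dacpac).Length -gt 0)) {
    "DACPAC present: {0:N0} bytes" -f (Get-Item $dacpac).Length
} else {
    Write-Warning "DACPAC missing or empty - re-run the SqlPackage extract"
}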
Upload DACPAC and Mapping Files to Azure Storage
Why Azure Storage?
The Azure DevOps import service retrieves the DACPAC and mapping files from a location it can reach over the internet: an Azure Blob Storage container secured with a SAS token.
Step 1: Create a Storage Account & Container
- Go to the Azure Portal: https://portal.azure.com
- Create a new Storage Account:
- Click “Create a resource” → “Storage account”
- Choose your Subscription, Resource Group, and set a Storage account name
- In “Networking”, choose “Public endpoint (all networks)”
- Leave other settings at defaults or adjust as needed
- Click “Review + Create” → Create
- Create a Blob Container:
- Go to the newly created Storage Account
- In the left menu, click “Containers”
- Click “+ Container”
- Name it something like migrate-data-azure-server
- Set the Public access level to Private (no anonymous access); we'll use the SAS token for access
- Click “Create”
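If you prefer scripting over the portal, the same resources can be created with the Azure CLI. A sketch; the resource group name, storage account name, and location below are placeholders rather than values from this walkthrough:
# Create a resource group, storage account, and private container for the import artifacts
az group create --name rg-devops-migration --location centralus
az storage account create --name <storageaccount> --resource-group rg-devops-migration --location centralus --sku Standard_LRS
# Depending on your permissions you may need --auth-mode login or --account-key here
az storage container create --account-name <storageaccount> --name migrate-data-azure-server --public-access off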
Step 2: Generate a SAS Token
- In the Storage Account, go to “Shared access signature” (left-hand menu).
- Under Allowed services, select: Blob
- Under Allowed resource types, select: Container and Object
- Under Allowed permissions, select: Read and List
- Set:
- Start time: a few minutes in the past
- Expiry time: e.g., 7 days in the future
- Click “Generate SAS and connection string”
- Copy the Blob SAS URL — it will look like:
https://<account>.blob.core.windows.net/migrate-data-azure-server?sv=…&sp=rl&st=…&se=…&sig=…
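The same container-level SAS can also be generated from the Azure CLI; the account name, account key, and expiry below are placeholders:
# Generate a Read+List (rl) SAS token scoped to the container
az storage container generate-sas --account-name <storageaccount> --name migrate-data-azure-server --permissions rl --expiry 2025-07-01T00:00Z --account-key "<account-key>" --output tsv
Append the returned token (after a ?) to https://<account>.blob.core.windows.net/migrate-data-azure-server to form the SAS URL used later in import.json.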
Step 3: Upload Files via Azure Portal UI
- Go to your Storage Account → Containers → migrate-data-azure-server
- Click on the container name to open it
- Click “Upload” at the top
- In the file picker:
- Select the AzureDevOps_DefaultCollection.dacpac file
- (Optional) Select the identity-mapping CSV file if required
- Click Upload
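AzCopy is an alternative to the portal upload. Note that uploading needs write access, so either run azcopy login first or use a separate SAS that includes Create+Write permissions; the Read+List SAS above is only what the import service needs to read the files. The paths below mirror the earlier examples:
azcopy copy "C:\Users\Administrator\Desktop\DataMigrationTool\PrepareOutput\Logs\DefaultCollection\<timestamp>\AzureDevOps_DefaultCollection.dacpac" "https://<account>.blob.core.windows.net/migrate-data-azure-server?<SAS-with-write-permission>"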
Step 4: Verify Upload (List Files)
- In the container view (migrate-data-azure-server), you will see the list of uploaded blobs (files).
- Confirm that your .dacpac and any .csv files appear there.
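You can also confirm the Read+List SAS works end to end by listing the container from outside the portal, which is effectively what the import service will do:
# Should print the uploaded blobs; if this fails, the import service will also fail to read them
azcopy list "https://<account>.blob.core.windows.net/migrate-data-azure-server?sv=…&sp=rl&st=…&se=…&sig=…"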
Edit import.json and Identity Mapping
Review import.json generated by Prepare (path example: …\PrepareOutput\Logs\DefaultCollection\20250616_105235\import.json):
{
  "Source": {
    "Location": "<Provide the SASKey to the Azure storage container with the collection and import files.>",
    "Files": { "Dacpac": "AzureDevOps_DefaultCollection.dacpac" }
  },
  "Target": {
    "Name": "<Provide a name for the organization that will be created during the import.>"
  },
  "Properties": {
    "ImportType": "<DryRun or ProductionRun>"
  },
  "ValidationData": { /* metadata from prepare */ },
  "Identities": [ /* on-prem SIDs */ ]
}
Update Source.Location:
Set to container-level SAS URL (with sp=rl):
"Source": {
  "Location": "https://<account>.blob.core.windows.net/migrate-data-azure-server?sv=…&sp=rl&st=…&se=…&sig=…",
  "Files": { "Dacpac": "AzureDevOps_DefaultCollection.dacpac" }
},
Update Target.Name:
- Choose a new, unused organization name in Azure DevOps Services, e.g., migrateorg345678-dryrun for testing. The import service will create this org; do not pre-create an org with that name.
- After a successful dry run, for production use a final name like migrateorg345678 (ensure it’s available). If you previously created an empty org with that name, delete it first so import can create it.
Set ImportType:
For initial test:
"Properties": {
  "ImportType": "DryRun"
},
For final production:
"Properties": {
  "ImportType": "ProductionRun"
},
Identity mapping:
If Prepare created an identity-mapping CSV, ensure each on-prem user is mapped to a valid Azure AD UPN in your tenant. For example:
"S-1-5-21-…-500","aman@amannextgensoft.onmicrosoft.com","Aman Mishra"
- In import.json, the Identities array typically contains SIDs; the import service uses mapping CSV or Azure AD lookups to resolve those to actual Azure AD accounts. If the mapped Azure AD account does not exist, the identity is imported as historical (viewable in history but not active until you add the user later).
- Key: ensure mapped UPNs are real Azure AD accounts (Member) in the tenant.
Save the updated import.json.
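Before moving on, a quick sanity check that the edited file is still valid JSON can save a failed import attempt; the path is the example used above:
# Throws a parse error if a stray comma or quote crept in while editing
Get-Content "C:\Users\Administrator\Desktop\DataMigrationTool\PrepareOutput\Logs\DefaultCollection\20250616_105235\import.json" -Raw | ConvertFrom-Json | Out-Null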
Authentication: Azure AD Account and PAT
Why Azure AD?
- Azure DevOps Services uses Azure AD for identity. The import operation must authenticate with an Azure AD user in the target tenant, since that user becomes the org owner.
- Personal Microsoft Accounts (MSAs) cannot serve as import-run identities in Azure AD-backed orgs.
Generate a PAT under an Azure AD user:
- In Azure DevOps Services, sign in using your Azure AD account context (choose “Default Directory” not “Microsoft account”).
- Go to User Settings → Personal Access Tokens → New Token.
- Select scopes needed (e.g., “Manage” or specific migration scopes if documented). Copy the PAT.
- When running Migrator.exe import, use this PAT. If prompted interactively, sign in; or set environment variable AZURE_DEVOPS_EXT_PAT or similar if supported.
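If you also script surrounding or post-import steps with the az devops CLI, the PAT can be supplied through an environment variable that the CLI extension reads. Whether Migrator.exe itself honors this variable depends on the tool version, so treat it as an assumption and prefer the interactive prompt if unsure:
# Session-scoped PAT for the az devops CLI extension (AZURE_DEVOPS_EXT_PAT is its documented variable)
$env:AZURE_DEVOPS_EXT_PAT = "<your-PAT>"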
Run the Import
Dry Run Import Command:
.\Migrator.exe import /importFile:"C:\Users\Administrator\Desktop\DataMigrationTool\PrepareOutput\Logs\DefaultCollection\20250616_105235\import.json"
- The tool validates the import file (including Source.Location accessibility, target name availability, identity mapping).
- It prompts you to confirm that the imported data can be staged for up to 7 days.
- For a ProductionRun it warns if no previous dry run exists; for a DryRun, simply confirm and continue.
- On success: returns “Import has been successfully started!” and provides:
- Monitor URL: https://dev.azure.com/<YourOrgName>
- Import ID.
Monitor Progress:
- Open the provided URL in a browser, sign in as the Azure AD import user. Under Organization Settings → Migration (or the monitor link), check status.
- Import can take time depending on size and included services (e.g., Repos, Work Items, Pipelines, Test Plans, Artifacts, Analytics).
Production Import:
- After the DryRun has been validated, quiesce the on-prem collection again, detach it, extract a fresh DACPAC, upload it, and update import.json:
- Target.Name: final desired org name (unused at that moment).
- Properties.ImportType: “ProductionRun”.
- Run import with PAT under Azure AD user. Confirm warnings about rollback. Monitor.
Post-Import: Validate and Configure
After import completes successfully:
- Verify data:
- Repos: branches, commits, PRs, branch policies (you may need to reconfigure build pipeline triggers or branch policies).
- Work Items: boards, backlog, queries, history (mapped to Azure AD users).
- Pipelines: review build/release pipelines. You’ll likely need to reconfigure agent pools (set up new self-hosted agents or use Microsoft-hosted agents) and service connections (e.g., Azure subscriptions, Docker registries).
- Test Plans & Artifacts: check test suites/results and package feeds; reconfigure if needed.
- Analytics/Dashboards: if included, verify dashboards and reports.
- Add users:
- Invite team members in Organization Settings → Users, using their Azure AD accounts. Assign access levels (Basic, Stakeholder) and project-level permissions (a CLI sketch for this follows the checklist below).
- Historical identities are present in history but not active; map or invite users as needed so they can continue work.
- Extensions and integrations:
- Reinstall or configure marketplace extensions, service hooks, and integrations (e.g., Slack, Teams, webhooks).
- Verify security and compliance:
- Check organization policies, branch protections, permissions.
- Decommission on-prem:
- After confirming successful cutover, retire or archive the on-prem Azure DevOps Server. Keep a backup of the detached DB for archival.
- Billing and capacity planning:
- Configure billing for Azure DevOps Services: user licenses, parallel jobs, storage.
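For the user-onboarding step mentioned above, a sketch using the az devops CLI (requires the azure-devops extension); the organization URL and UPN are placeholders or examples from this walkthrough, and the "express" license type corresponds to Basic:
az devops user add --email-id aman@amannextgensoft.onmicrosoft.com --license-type express --org https://dev.azure.com/<YourOrgName>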
Deep Dive: Why Azure AD Accounts Are Mandatory
During Prepare, an identity-mapping CSV is generated that maps on-prem SIDs to target identities, and you edit it. But:
- Mapping vs. actual identity existence: The mapping file tells import how to label historical data, but if the mapped UPN isn’t present in Azure AD, import treats that user as historical (no active access). To have active users in the cloud org, each must have a real Azure AD account in the tenant.
- Authentication for import: The import tool must authenticate to Azure DevOps Services to create a new organization and push data. Only Azure AD users in the tenant can sign in or use PATs for this operation. MSAs or on-prem Windows accounts cannot authenticate to the cloud org context.
- Organization ownership: The Azure AD account running import becomes the initial owner/administrator. Azure DevOps Services must be able to find that account in Azure AD to assign org-level rights.
- Security model: Azure DevOps Services relies on Azure AD for access control, licensing, MFA, conditional access, and auditing. All users must exist in Azure AD for consistent governance.
- Guest accounts: Invited guests (#EXT#) can be used but may complicate mapping; best to use Member accounts with clear UPNs. If a user doesn’t exist in Azure AD, create/invite them before import so they can be mapped to active accounts.
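A quick way to confirm that each UPN you intend to map actually exists in the tenant is to look it up before the import. A sketch using the Azure CLI; the UPN list here is just the example account from this walkthrough, and in practice it would be whatever your identity-mapping CSV references:
foreach ($upn in @("aman@amannextgensoft.onmicrosoft.com")) {
    # Prints the UPN if the user exists; otherwise warns that history will map to a historical identity
    az ad user show --id $upn --query userPrincipalName -o tsv 2>$null
    if ($LASTEXITCODE -ne 0) { Write-Warning "$upn not found in Azure AD - it will be imported as a historical identity" }
}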
Analogy: Think of Azure AD as the cloud org’s “address book.” During import, you need a valid entry (Azure AD account) for:
- Who is performing the import and owns the new org (import-time login).
- Who each on-prem user maps to in history (mapping to an address book entry ensures the right person sees notifications and has access).
Without these, import can only label entries as “historical” or fail authentication.
Commands Recap
Below are the core commands used in this migration, with brief explanations:
- Validate:
.\Migrator.exe validate `
  /collection:"http://ec2amaz-61dqj8s/DefaultCollection" `
  /tenantDomainName:"amannextgensoft.onmicrosoft.com" `
  /region:"CUS" `
  /output:"C:\Users\Administrator\Desktop\DataMigrationTool\ValidateOutput"
Checks collection readiness, identity lookup in Azure AD, feature compatibility.
- Prepare:
.\Migrator.exe prepare `
  /collection:"http://ec2amaz-61dqj8s/DefaultCollection" `
  /tenantDomainName:"amannextgensoft.onmicrosoft.com" `
  /region:"CUS" `
  /output:"C:\Users\Administrator\Desktop\DataMigrationTool\PrepareOutput"
Generates import.json, identity-mapping CSV, process templates and other artifacts.
- Detach:
- Via Admin Console or:
TFSConfig Collection /detach /collectionName:DefaultCollection /serverUrl:http://ec2amaz-61dqj8s
- Ensures the DB is quiesced and marked for import.
- Extract DACPAC:
& "C:\path\to\sqlpackage.exe" /Action:Extract `
  /SourceServerName:"EC2AMAZ-61DQJ8S\SQLEXPRESS" `
  /SourceDatabaseName:"AzureDevOps_DefaultCollection" `
  /TargetFile:"C:\…\AzureDevOps_DefaultCollection.dacpac" `
  /SourceTrustServerCertificate:True `
  /p:ExtractAllTableData=true `
  /p:IgnoreUserLoginMappings=true `
  /p:IgnorePermissions=true `
  /p:CommandTimeout=0
- Edit import.json:
- "Source.Location" = container SAS URL with sp=rl
- "Target.Name" = new unused org name (e.g., migrateorg345678-dryrun)
- "Properties.ImportType" = "DryRun"
- Ensure identity mappings point to valid Azure AD UPNs.
- Import:
.\Migrator.exe import /importFile:"C:\Users\Administrator\Desktop\DataMigrationTool\PrepareOutput\Logs\DefaultCollection\<timestamp>\import.json"
- Authenticate with PAT under an Azure AD account in the target tenant.
- Confirm prompts and monitor via the provided URL.
Conclusion
Migrating to Azure DevOps Services involves multiple steps: validating and preparing the on-prem collection, ensuring Azure AD identity readiness, detaching and extracting the database, uploading artifacts to Blob Storage with proper SAS, and running the import under an Azure AD account. Understanding why Azure AD identities are mandatory is crucial: the cloud service requires valid Azure AD accounts for authentication, ownership, and mapping historical data to active users.
By following this guide and verifying each step, Validate, Prepare, Detach, Extract DACPAC, Upload to Storage, Edit import.json, and Import under Azure AD, you can perform a smooth migration. After import, validate your repos, work items, pipelines, and configure users and permissions in the new organization.
Feel free to adapt this blog for your audience, include screenshots of the commands and Azure Portal steps, and share lessons learned about identity mapping and Azure AD integration. Happy migrating to the cloud!